LakTEK (Lakshan Perera)
http://laktek.com
A Sri Lankan, A Creator & An Explorer

My First Three Months with Nitrous.IO
http://laktek.com/2013/04/12/my-first-three-months-with-actionio
Thu, 11 Apr 2013 16:00:00 GMT

The last 3 months have been the most exciting and yet challenging period of my life. I migrated from Sri Lanka to Singapore to join an early-stage startup called Action.IO.

I first heard about Action.IO on HackerNews, sometime around last July. It earned my immediate respect as an engineering startup emerging from Asia. It was also solving a real problem; I felt it could change the way we code, if executed well.

I got to meet Arun and Pete while I was in Singapore for JSCamp.Asia. We had a casual chat over breakfast that spanned from Asian folk stories to OT engines. After I moved back to Sri Lanka, we had some lengthy email conversations. As a result, I decided to move to Singapore and join the Action.IO team.

Personally, I'm a big fan of resources and tools that make life easy for developers. It's those great contributions from others that make it possible for us to come up with more awesome creations. I always try to contribute back to this culture in the little ways I can. I saw that working with Action.IO could give me the opportunity to do that in a broader and more focused manner.

Operates like an open source project

After my first day at work with the Action.IO team, I said to myself, "Wow! These guys are super smart!"

It was intimidating to work with such a smart and productive team. But during the next couple of weeks, I realized it's the process and the environment that make the difference.

At Action.IO, nobody will tell you what to do or how you should do it. You have to manage yourself. The company's goals and current position are briefed weekly during a stand-up. Also, you are given access to support channels, application logs and stats from the first day. Based on these inputs, you get an idea of what needs to be done and how important it is. When you stumble upon a feature idea or a bug, you create a story for it in Pivotal Tracker. It could either be you or someone else in the team that gets it done.

At the start of each work day, you glance through the pending tasks in the tracker and assign yourself a task to work on. If a task could span more than a day, you should simplify it by breaking it into sub-tasks. This allows you to actually deliver something at the end of every day and get a good night's sleep.

Pull-requests & Code reviews

For every task you work on, you create a feature branch. Nobody commits directly to master (well, I did once). When you complete the task, you send a pull request to merge your feature branch into master (we use GitHub private repos).

At this point, the rest of the team reviews your pull request and offers tips for improvement or discusses the implications of the suggested implementation. Personally, these code reviews helped me unlearn a lot of bad practices and think things through before arriving at a solution. Code reviews are highly subjective, so it's important not to entangle your ego with the code you wrote.

Also, at Action.IO there's a strong emphasis on test-driven development, documentation and maintaining a clean commit history. As a newcomer, these practices allowed me to get a better understanding of previous decision making and acclimate to the code base.

Dogfooding

We use Action.IO to build Action.IO. This makes induction relatively frictionless. On the first day, I was given access to a dev box in the cloud, which was already configured with the tool chain. All I needed to do was move in my dotfiles and configure the SSH tunneling on my local machine. Such bliss.

Action.IO's stack is built on top of many other open source tools and libraries. The philosophy here is to use the right tools to get things done faster. We've contributed improvements back to many of the projects we use and have also open-sourced several components used in our stack.

An office with "library rules"

Though I had read about this on the 37signals blog, I never thought it was actually possible until coming to the Action.IO office.

There are no cross-chats or disturbances during work periods. All communication happens through HipChat (earlier we used Campfire), so if you don't want to get distracted you can turn off the notifications. We have hooks configured that post activity from Pivotal Tracker and GitHub to the room. This keeps everyone in the loop about what others are working on.

On most days, the whole team goes out for lunch together (the best thing about being in Singapore is the food!). These team lunches do get really noisy, with interesting arguments or just making fun of each other :)

Though there's so much activity in HipChat during the week, it goes insanely silent over the weekend. If you happen to log in accidentally, you will find yourself alone in the room. Instead, you're encouraged to step away from the screen and pursue other hobbies and interests during the weekend.

Connecting with other developers

What I love most about Action.IO is getting to connect with a lot of other developers. As I said earlier, everyone in the team has access to support channels and everyone does support. The insights you gain from these chats are invaluable. You learn about different contexts, workflows and approaches used by others. And it's flattering to hear how others make use of the product to do interesting stuff.

At Action.IO, I feel I'm improving myself as a developer, while helping others to improve themselves.

PS: Action.IO is always on the lookout for more passionate people to join the team. If you're yearning for a challenge, you should get in touch.

Update (17 April 2013): The company has been rebranded to Nitrous.IO since this post was written.

The Next Generation is Here!
http://laktek.com/2013/01/06/the-next-generation-is-here
Sat, 05 Jan 2013 16:00:00 GMT

On January 1st, among the usual flow of new year wishes in my inbox, a special message stood out:

Hi Aya did you use Html for coding websites ? i'm also coding websites but with Php

Where did you host ur website ? did u host on ur own sever?

It was from one of my cousins. I hadn't seen him in 8 years. He was just a toddler when he migrated to France with his parents in 2004, just after the tsunami. He's becoming a teenager this year. And he wants to make websites.

I created my first web site when I was 14. Back in the day, creating and publishing a personal homepage on Geocities was considered the coolest thing you could do on the internet as a kid. My uncle (his dad) boasts that he will soon catch up with me. But he lives in an era where you can get 15,000 random people to like you just by saying "hey". He can take a picture of his food, apply a Polaroid filter and publish it on the web in a matter of seconds.

Then why the hell does he want to create his own website?

For long, we have debated what could be the Facebook killer. The answer is here. No, it will not be another Silicon Valley startup. It is the post-millennials - the generation my cousin belongs to. They will soon revolt against the walled-garden social web. They will want to take the web back to its roots.

What can we do, as the veterans of the previous generation, to fuel this revolution? Should they be using a double-clawed hammer like PHP? Should they be forced to stay up all night figuring out how to configure Apache and MySQL for their environment? Should they still be reading tutorials that include hacks to circumvent the limitations of IE6 (or even have to know what IE6 is)?

They grew up experiencing vastly better systems. We've spoiled them with tools that are intuitive and fun to use. The web they know is so easy to consume. Can we also make it a fun platform to create on? A platform that will help them innovate better and move faster?

They deserve tools that can widen their creativity, rich resources that can help them learn better and, of course, guidance on how not to repeat the same mistakes we made.

This is where I want to spend my energy in 2013.

Revisiting JavaScript Objects
http://laktek.com/2012/12/29/revisiting-javascript-objects
Fri, 28 Dec 2012 16:00:00 GMT

During the holidays, I spent some time catching up on the developments in ES6 (the next version of JavaScript). While going through some of the proposals, such as Minimal Class Definitions, the Proxy API and Weak Maps, I noticed that most of these enhancements make extensive use of the object manipulation features introduced in ES5 (i.e. ECMAScript 5, the current JavaScript standard).

One of the main focuses of ES5 has been to improve JavaScript's object structure and manipulation. The features it introduced make a lot of sense, especially if you're working with large and complex applications.

We've been a little reluctant to adopt ES5 features, especially due to browser compatibility issues. We rarely see production code that makes use of these features. However, all modern browsers (i.e. IE9, FF4, Opera 12 & Chrome) have JavaScript engines that implement the ES5 standard. Also, ES5 features can be used in Node.js based projects without any issue. So I think it's a worthwhile exercise to revisit the ES5 object features and see how they can be used in real-life scenarios.

Data and Accessor Properties

ES5 introduces two kinds of object properties - data and accessor. A data property directly maps a name to a value (e.g. an integer, string, boolean, array, object or function). An accessor property maps a name to a defined getter and/or setter function.

var square = {
    length: 10,
    get area() { return this.length * this.length },
    set area(val) { this.length = Math.sqrt(val) }
}

Here we have defined a square object, with length as a data property and area as an accessor property.

> square.length
  10
> square.area
  100
> square.area = 400
  400 
> square.length
  20

When we access the area property, its getter will calculate and return the value in terms of the length property. Also, when we assign a value to area, its setter function will change the length property.

Property Descriptor

ES5 allows you to have more fine-grained control over the properties defined in an object. There's a special attribute collection associated with each property, known as the property descriptor.

You can check the attributes associated with a property by calling the Object.getOwnPropertyDescriptor method.

> Object.getOwnPropertyDescriptor(square, "length")
{
    configurable: true
    enumerable: true
    value: 20
    writable: true
}

> Object.getOwnPropertyDescriptor(square, "area")
{
    configurable: true
    enumerable: true
    get: function area() { return this.length * this.length }
    set: function area(val) { this.length = Math.sqrt(val) }
}

As you can see from the above two examples, value and writable attributes are only defined for data property descriptors, while get and/or set are defined for accessor property descriptors. Both the configurable and enumerable attributes apply to any kind of property descriptor.

The writable attribute specifies whether a value can be assigned to a property. If writable is false, the property becomes read-only. As the name implies, configurable specifies whether the property's attributes can be changed and also whether the property can be deleted from the object (using the delete operator). The enumerable attribute determines whether the property should be visible in for..in loops or in the output of Object.keys.

We can modify these attributes in the property descriptor by using the Object.defineProperty method.

Object.defineProperty(square, "length", {
    value: 10,
    writable: false
});

This will make the length property in square read-only and permanently set to 10.

> square.length
  10
> square.area = 400
  400
> square.length
  10
> square.area
  100
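
The enumerable attribute can be toggled in the same way. Here's a minimal sketch, continuing with the same square object, that hides length from enumeration:

> Object.defineProperty(square, "length", { enumerable: false });

> Object.keys(square)
  ["area"]
> square.length
  10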

Tamper-proof Objects

In some instances, you need to preserve an object in its current state at run-time, without any further extensions or modifications to its properties. ES5 provides three levels of control that you can apply to objects.

Calling the preventExtensions method makes the object non-extensible. This means no further properties can be defined on the object.

> Object.preventExtensions(square);

> Object.defineProperty(square, "text", { value: "hello" });
  TypeError: Cannot define property:text, object is not extensible.

Sealing the object prevents both the defining of new properties and the deletion of existing properties in the object.

> Object.seal(square);

> delete square.length
  false

If we go one step further and freeze the object, it will also disallow changing the existing property values in the object. At this point, the whole object effectively becomes a constant.

> Object.freeze(square);

> square.length = 20
  20 
> square.length
  10

You can use the methods Object.isSealed, Object.isFrozen and Object.isExtensible to programmatically check the state of an object.
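
For instance, after the freeze above, all three checks reflect the state of square (a frozen object is also sealed and non-extensible):

> Object.isExtensible(square)
  false
> Object.isSealed(square)
  true
> Object.isFrozen(square)
  true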

Even when an object is protected, it is still possible to extend its prototype. Check the following example:

var obj = Object.create({}, { onlyProp: { value: true } });
Object.preventExtensions(obj);

var proto = Object.getPrototypeOf(obj);
proto.anotherProp = true;

> obj.anotherProp
  true

Enumerations

Often, we use JavaScript objects as associative arrays or collections. In such instances, we are tempted to use for...in loops to enumerate over the properties. However, the loop will step through all enumerable properties available in the object's prototype chain, resulting in undesired outcomes.

To avoid such side effects, JSLint suggests manually checking whether the given property is defined on the object itself.

for (name in object) { if (object.hasOwnProperty(name)) { .... } }

ES5 provides the Object.keys method, which returns an array of an object's own enumerable properties.

We can use this method to safely iterate over a property list:

Object.keys(obj).forEach( function(key) {
    console.log(key);
});

Note: Array.prototype.forEach is also a new feature introduced in ES5.

Inheritance

We know JavaScript provides behavior reuse through prototypal inheritance. However, the lack of a direct mechanism to create a new object using another object as a prototype has been one of the pet peeves with the language.

The standard way to create a new object is to use a constructor function. This way, the newly created object will inherit the prototype of the constructor function.

var Person = function(first_name, last_name) {
    this.first_name = first_name;
    this.last_name = last_name;
}

Person.prototype = {
    say: function(msg) {
        return this.first_name + " says " + msg;
    }
}

var ron = new Person("Ron", "Swanson");

If someone calls the constructor function without the new operator, it could lead to unwarranted side effects during execution. Also, there's no semantic relationship between the constructor function and its prototype, which can cause confusion when trying to comprehend the code.
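
As a quick illustration of the first problem, here's a small sketch (assuming non-strict mode, where this inside the function points to the global object; the names are only for illustration):

var bob = Person("Bob", "Builder"); // `new` was forgotten

> bob
  undefined
> first_name // the properties leaked onto the global object
  "Bob"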

For those who prefer an alternate syntax, ES5 provides the Object.create method. It takes a prototype object and a set of property descriptors as arguments.

Here's an alternative implementation that can be used to create Person objects, using Object.create and the module pattern.

var Person = (function() {

    var proto = {
        say: function(msg) {
            return this.first_name + " says " + msg;
        }
    }

    return {
        init: function(first_name, last_name) {
            return Object.create(proto, {
                first_name: { value: first_name, enumerable: true },
                last_name: { value: last_name, enumerable: true }
            });
        }
    }

})();

var ron = Person.init("Ron", "Swanson");

However, compared to constructor functions, using Object.create can be considerably slower. So choose which implementation you want to use depending on the context and requirements.

Even if you prefer to use constructor functions, Object.create will come in handy when you want to have multiple levels of inheritance.

var Person = function(first_name, last_name) {
    this.first_name = first_name;
    this.last_name = last_name;
};

Person.prototype = {
    say: function(msg) {
        return this.first_name + " says " + msg;
    }
};

var Employee = function(first_name, last_name) {
    Person.call(this, first_name, last_name);
}

Employee.prototype = Object.create(Person.prototype, {
    department: { value: "", enumerable: true },
    designation:{ value: "", enumerable: true }
});

var ron = new Employee("Ron", "Swanson");

We've extended the Person prototype to create the prototype of Employee.

Cloning Objects

Finally, let's see how to create a shallow clone of an object using ES5's object methods.

var clone = function(obj) {
    // create clone using given object's prototype
    var cloned_obj = Object.create(Object.getPrototypeOf(obj)); 

    // copy all properties
    var props = Object.getOwnPropertyNames(obj);
    props.forEach(function(prop) {
        var propDescriptor = Object.getOwnPropertyDescriptor(obj, prop);            
        Object.defineProperty(cloned_obj, prop, propDescriptor);
    });

    return cloned_obj;
}

Here, we retrieve the prototype of the given object and use it to create the clone. Then we traverse all the properties defined on the object (including non-enumerable properties) and copy their property descriptors to the clone.

Further Reading

If you're interested in learning more about JavaScript objects and how to manipulate them, I would recommend perusing the following resources:

End of a Chapter
http://laktek.com/2012/12/24/end-of-a-chapter
Sun, 23 Dec 2012 16:00:00 GMT

Thank you Vesess!

Yesterday was officially my final day with the Vesess team.

It was 2006. I was just out of college, spending most of the day lurking on local internet user groups. If my memory serves right, it was from a random discussion thread that I first stumbled upon Prabhath Sirisena and his blog, Nidahas (unfortunately dead now). Nidahas aroused my curiosity about topics such as Web Standards, Open Source Software and Startup Culture. It also inspired me to create my own blog.

A couple of months later, I was selected to the IT faculty of the University of Moratuwa. Prabhath was also a final year student at the same faculty. Though I always wanted to meet him in person, it didn't happen until this sensational IM conversation:

Prabhath: hi lakshan, free for a chat?

me: ya sure

Prabhath: ok, let me be frank and get on with it you know about Vesess, i guess?

me: whats up? ya your company ni

Prabhath: right, so we're on the lookout for young prospects. Could you tell me your current "availability" so to speak, so i can do my dirty pitch?

me: i wud luv to join with vesses

That decision was a no-brainer. Vesess had already built up a reputation for its quality work and seemed to be aiming at a different level compared to other Sri Lankan web design agencies. I knew this would be the perfect place for a freshman like me to groom my career.

A few days later, I went to a nice little office in Horton Place. I was welcomed by two lanky, pleasant-looking guys who seemed to be in their early twenties. They happened to be the two co-founders of Vesess. That was my first interaction with Lankitha and Prabhath in meatspace.

Part of the orientation at Vesess involves answering a questionnaire called "Think First". It's a guide which both you and the company can use to assess your progress in the long run. One of the questions in it was:

How would you like to describe yourself in 5 years from now? Can you explain this with targets?

This was my answer to it:

I would be a team leader of a small start-up that will make huge impact on the society, by developing products that would simplify the lives of many.

Back then, I had no idea how realistic that dream was. But today, 6 years down the line, I can proudly say I have achieved that exact goal with CurdBee.

At the beginning of 2008, Lankitha wanted to diversify the company from being solely a web design agency into a product company. CurdBee was our first attempt at reaching this goal. I was released from client projects to focus fully on the development of CurdBee.

There were a lot of unknowns when we started. How to monetize the product? How to get a payment gateway and a merchant account from Sri Lanka? How to market the product? How to get popular tech blogs to cover us? How to provide support? How to scale the application? How to make it secure? How to retain the users? How to find talent?

During the last 4 years, we worked hard to find answers to these questions. The current state of CurdBee proves we have answered them in the best way we could. It wasn't an overnight success. We had many arguments, sleepless nights, moments of panic, tears and a bunch of small victories along the way.

Once you’re successfully past 1.0, you have a choice: coast and die, or disrupt. No one in history has ever actually chosen coast and die; everyone thinks they’re choosing the path of continued disruption, but it’s a very different choice when it’s made by a Stable than by a Volatile. A Stable’s choice of disruption is within the context of the last war. They can certainly innovate, but they will attempt to do so within the box they bled to build. A second-generation Volatile will grin mischievously and remind you, “There is no box.”

This beautiful essay by Rands is a perfect summary of my decision to move away from Vesess. I still have that desire to disrupt, but I believe the time has come for me to pick a new battle. I love getting into adventures full of challenges and uncertainties. Those are the environments that bring out the best in me. I prefer being a Volatile to a Stable. You will soon get to hear more about my next moves.

The army I created at Vesess is now ready for the limelight. There's a perfect blend of Volatiles and Stables who can continue to disrupt. I'm sure they will take CurdBee to the next level and innovate better than in my reign.

I'm forever grateful to Lankitha for believing in me and helping me reach my dreams. He always preferred to remain behind the scenes, setting the stage for us to shine. You are a great leader, a friend and a brother. Prabhath, Mahangu, Laknath, Sameera, Amila, Lahiru, Asantha, Asanka, Anushke, Ramindu and Mahinda: it was a pleasure working with each one of you and I enjoyed every moment of it.

Thank you Vesess for everything!

JSCamp Asia
http://laktek.com/2012/12/04/jscamp-asia
Mon, 03 Dec 2012 16:00:00 GMT

Singapore city view at night

Last week I was in Singapore to attend JSCamp.asia. It was the first full-scale JavaScript conference to be staged in this region and it had nothing less than the vibrant, insane spirit you expect from a JavaScript conference.

JSCamp happened to be the first international conference I was selected to give a talk at. My talk was titled "Embracing the Static Web". It touched on the challenges we face as web developers in this region and the need to create tools to overcome them while building modern web experiences. In the latter part of the talk, I explained how these needs motivated me to create Punch and shared how it fits the bill as a modern web framework. I received a lot of encouraging feedback and remarks after the talk. But I believe I could have done it better, and this experience will help immensely when I prepare for future talks.

Here's the slide deck for the talk.

Update: Video of the talk is now online.

JSCamp gave me the opportunity to meet and hang out with many interesting and respected personalities in the JS (and web tech) community. Seriously, they are so smart and passionate about their game. Also, I couldn't ignore how genuine and honest they were when it came to sharing their knowledge and experiences. Jed, Divya, Angus, Jan, Michal, Tim, Alex, Tomasz, Eric and John/Zackery of GitHub, you guys are awesome!

Also, I got to meet a lot of developers based in Singapore and others who came specifically for JSCamp from countries like Thailand, Indonesia, Vietnam, South Korea and the Philippines. They seem to do a lot of interesting stuff and I gathered plenty of insights by talking to them. Most of them have followed traditional educational paths (which is common in Asia) and are now eagerly switching gears to expand their knowledge into web technologies.

Overall, I feel there will be a huge bloom in the web tech industry and related startups in this region in the coming years. Singapore, especially, seems to have the right ingredients to ignite such a trend. However, it's still an infant compared to the ecosystem in Silicon Valley. But with more experience and inspiration, I'm sure things will start to change rapidly.

That's why we need more events like JSCamp in this region. I expect to see others also taking up this challenge, as Thomas and Sayanee did. Great job, guys!

Punch based Boilerplate to Power Your Blog
http://laktek.com/2012/11/26/a-fast-intuitive-blogging-tool-based-on-punch
Sun, 25 Nov 2012 16:00:00 GMT

When I first created Punch, I didn't intend it to be used as a blogging tool. This blog was running on Jekyll at that time and I was content with the setup. However, as I was switching back and forth between Punch based projects and this Jekyll based blog, it became apparent how much Jekyll's workflow misses the simplicity and intuitiveness I have with Punch. So I wanted a Punch based extension for blogging.

I wrote a special blog-specific content handler which can be added to any Punch based site. It also preserves a lot of Jekyll's conventions, such as the post structure and permalinks. This makes the transition of a blog from Jekyll to Punch really smooth.

Last week, I switched this blog from Jekyll to a Punch based setup. I'm happy with how it turned out. The workflow of creating, previewing and publishing a post is now much faster and more intuitive. You can check the source code of the project to get an idea of how it is structured.

Since a lot of you were asking about the possibility of using Punch to power a blog, I thought it would be useful to share my workflow. So I decided to release a boilerplate based on my setup, which you can use to create your own blog.

Intro Screen
Screenshot of Punch Blog Boilerplate

Here are some of the cool features available in this boilerplate:

  • Preview posts, as you write them.
  • Easily publish to Amazon S3.
  • Pretty URLs for permalinks (no .html, configurable).
  • Responsive, customizable theme based on HTML5Boilerplate & 320andup framework.
  • Load fonts from multiple sources with WebFonts Loader.
  • Easily configure Google Analytics, Tweet button & Disqus comments.
  • Highlighting the current page link.
  • Post archives by tags.
  • Post archives by year, month or date.
  • Write posts using GitHub flavored Markdown.
  • Client-side code highlighting with Prism.js
  • Published/draft states.
  • Automatically minifies and bundles JavaScript/CSS.
  • RSS feed
  • Sitemap.xml

Also, you can use any other features available in Punch.

  • Manage other pages with Punch's default content handler.
  • Extend the behavior by writing your own helpers.

You can download the boilerplate from GitHub. After downloading it, follow the setup instructions specified in the README. It's very easy to customize and extend.

Feel free to open a ticket in GitHub issues if you run into any bugs.

Embrace the Static Web with Punch
http://laktek.com/2012/11/04/embrace-the-static-web-with-punch
Sat, 03 Nov 2012 16:00:00 GMT

This was originally published in the November edition of Appliness digital magazine.

Remember the very first web sites we created? They were just a bunch of HTML files that we uploaded to a remote host using our favorite FTP program. It was all static and it just worked. Yet it was boring. Visitors wanted sites that surprised them every time they hit refresh. Moreover, we also got tired of the slow and cumbersome process we had to follow every time we updated the site. Editing content within an HTML tag soup was chaos. We broke sites because we forgot to close some nested tag.

Along with the popularity of the LAMP (Linux, Apache, MySQL and PHP) stack came Content Management Systems. They seemed to be the right way to manage a web site and it didn't take long for them to become the de facto standard. CMSs allowed us to separate the content from the presentation and update our sites with just a couple of mouse clicks. Anyone could run a site, even without knowing HTML.

However, as our sites grow and start attracting more traffic, we see the shortcomings of CMSs. They become slow because they render the pages on each request. You need to tune the database and servers to handle the load. To add a trivial new feature to the site, you need to modify the internals of the CMS (which is often spaghetti code). Further, they are full of vulnerabilities. Remember the day your site got hacked because you missed one of those daily security updates? Managing a web site seems to take over your life and become a chore.

At times like this, we start to feel nostalgic about the static web sites we created. "It was just a bunch of HTML files. But it worked!"

This inspired me to write Punch, which brings back the simplicity of the static web along with the conveniences of content management. There's no server-side code to run, no databases to configure, no mix-up between HTML and content. You can use your favorite tools to create the site locally, preview it in the browser and finally publish the changes to any remote host using a simple command-line interface.

It's easier to understand the concepts of Punch with a simple real-life example. Let's create a site using Punch to share your favorite books with others. We shall call this project the "Reading List".

If you are in a hurry, you can check the final result from here and download the source code from GitHub.

Installing Punch

Before we start the tutorial, let's install Punch. To run Punch you will need Node.js. Make sure you have installed Node.js (version 0.8+) on your machine. Then open up your Terminal and type:

    npm install -g punch

This will install Punch as a global package, allowing you to use it as a shell command. Enter the command punch -v to check whether Punch was installed properly. This tutorial was created using Punch version 0.4.17.

Setup a New Site

Let's spin off a new project for our Reading List. By running the command punch setup, you can create the project structure with essential files.

    punch setup reading_list

This will create a directory named reading_list. Inside it we will see another two directories named templates and contents. Also, you will find a file named config.json. You will learn the purpose and role of these directories and files as the tutorial progresses.
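
The generated structure will look roughly like this:

    reading_list/
        config.json
        contents/
        templates/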

While we are inside the project directory, let's start the Punch server by running the command:

    punch s

This will allow us to preview the site we create in real-time. By default, the server starts on port 9009.

Open your browser and enter the URL http://localhost:9009. You should see the welcome screen along with a link to a quick hands-on tutorial. I highly recommend taking a couple of minutes to go through this quick tutorial first, which will help you grasp the core concepts of Punch. I'll wait till you finish it.

Intro Screen

Preparing the Layout

In the quick hands-on tutorial, you learnt that Punch uses Mustache as the default templating language. You also learnt that the layouts, partials and static assets that compose a site's user interface must be saved inside the templates directory.

Make sure you remove the {{{first_run}}} tag from templates/_footer.mustache to get a clean view sans the hands-on tutorial.

Now let's turn our focus back to the Reading List page we are creating. It should contain the following information:

  • Introduction
  • List of Books (we must provide the following information for each book)
    • Title
    • Cover image
    • Author
    • ISBN
    • Your rating
    • Favorite quote from the book
    • Link to the book in Amazon

We only need to create a single web page to show this information. So we can directly customize the default layout (templates/_layout.mustache) to create the view we need.

{{> header }}

        <div role="main">
            <p>{{{intro}}}</p>
            <div id="books">
                {{#book_list}}
                    <div class="book">
                        <h3><a href="{{amazon_link}}">{{{title}}}</a></h3>
                        <div class="cover">
                            <a href="{{amazon_link}}"><img src="{{cover_image}}"></a>
                        </div>
                        <ul>
                            <li><b>Author</b> - {{author}}</li>
                            <li><b>ISBN</b> - {{isbn}}</li>
                            <li><b>Rating</b> - {{rating}}</li>
                            <li><b>Favorite Quote</b> - {{{favorite_quote}}}</li>
                        </ul>
                    </div>
                {{/book_list}}
            </div>    
        </div>

{{> footer }}

Note that some Mustache tags are surrounded with three curly braces, while others are surrounded with two. By using three curly braces we tell Mustache not to escape the HTML within the content. In places where you want to have HTML-formatted content, you must use the tags with three curly braces.
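
For instance, with the favorite_quote value we will define later in this tutorial, the two forms render differently:

{{favorite_quote}}   renders as: &quot;The true product of a business is the business itself&quot;
{{{favorite_quote}}} renders as: "The true product of a business is the business itself"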

After modifying the layout, refresh the page in the browser to see the changes.

Creating the Reading List in JSON

You still won't see any visual difference in the page. But if you view the source of the page, you will see the HTML structure you defined, with empty values in the places where the Mustache tags were. We must provide content to render into those tags.

Let's start with the most essential piece of content on the page - the list of books. Open contents/index.json and start entering the details of your favorite books in the following format.

{
    "book_list": [
        {
            "title": "The E-Myth Revisited",
            "amazon_link": "http://www.amazon.com/gp/product/0887307280",
            "cover_image": "http://ecx.images-amazon.com/images/I/41ieA7d6CYL._SL160_.jpg",
            "author": "Michael E. Gerber",
            "isbn": "0887307280",
            "rating": "10/10",
            "favorite_quote": "\"The true product of a business is the business itself\""
        }
    ]
}

We've defined a JSON array named book_list which contains multiple book objects. For each book object, we define the required details as properties.

Save the file after entering the books and refresh the browser. You should now see the book list you just created rendered into the page as HTML.

You can continue to add more books or update the existing entries in contents/index.json. The page will be re-rendered every time you make a change to the content.

Writing the Introduction Text using Markdown

Now that we have listed our favorite books, let's add a simple introduction to the page. Rather than defining it as a JSON string, you can use Markdown formatting to write this piece.

When fetching content for a page, Punch will look for extended content, such as Markdown formatted text, in a directory named after the page and prefixed with an underscore. This directory must be placed inside the contents directory, along with the JSON file for the page.

In this instance, we should create a directory named _index and save our introduction inside it as intro.markdown. The filename of an extended content file should be the tag name you wish to use in the templates to retrieve that content.
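
For example, contents/_index/intro.markdown could contain something like the following (the text itself is just placeholder copy); it will then be rendered wherever the {{{intro}}} tag appears in the layout:

Here are some of my **all-time favorite books**. Click on a cover to view the book on Amazon.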

Changing the Site's Title

You will notice the site's title is still displayed as "Welcome". Let's change that too. Site-wide content, such as the site's title and navigation items, is defined in the contents/shared.json file. Open it and change the site's title to "Reading List".

Styling with LESS

Now that we are done preparing the content, let's make some style changes to beautify the page. You can use LESS to write the styles and Punch will automatically convert them into regular CSS.

As I mentioned previously, all static assets such as stylesheets, JavaScript files and images must be stored in the templates directory. You can organize them in any way you like inside the templates directory.

You will find the default site-specific styles in templates/css/site.less. Let's change them to achieve the design we want for the Reading List. To keep this tutorial concise, I won't show the modified stylesheet here; you can check it in the project's repo on GitHub.

Similar to processing the LESS files, Punch can also pre-compile CoffeeScript files into JavaScript automatically.

Minifying and Bundling CSS/JavaScript Assets

Minifying and bundling CSS & JavaScript assets are recommended performance optimizations for all kinds of web sites. They help reduce the number of round-trips browsers need to make in order to fetch the required assets and also minimize the size of the assets that need to be downloaded.

Minifying and bundling assets in a Punch based project is fairly straightforward. You only have to define your bundles inside the config.json. Then Punch will prepare and serve the minified bundles at the time of generating the site.

We can bundle the CSS files used in the project like this:

"bundles": {
    "/css/all.css": [
        "/css/normalize.css",    
        "/css/main.css",
        "/css/site.less"
    ]    
}

Then, you can use Punch's bundle helper tags to call defined bundles inside the templates.

<head>
    <!-- snip -->
    {{#stylesheet_bundle}}/css/all.css{{/stylesheet_bundle}} 
</head>

This will produce a fingerprinted stylesheet tag (useful for caching) like this:

<head>
        <!-- snip -->
        <link rel="stylesheet" type="text/css" media="screen" href="/css/all-1351313179000.css"> 
</head>

Similar to the stylesheet_bundle tag, there's a javascript_bundle tag which you can use to include JavaScript bundles in a page.
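
Its usage follows the same pattern; assuming you have defined a /js/all.js bundle in config.json, you would reference it like this:

{{#javascript_bundle}}/js/all.js{{/javascript_bundle}}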

Publishing to S3

Our Reading List page is now almost complete.

Finished Site

Let's share it with others by publishing it on the web. Punch allows you to either publish your site directly to Amazon S3 or upload it to a preferred host using SFTP.

In this tutorial, we will publish the site to Amazon S3. You will have to sign up with Amazon Web Services and enable the S3 storage service for your account. Then, create an S3 bucket and a user who has access to the bucket.

Enter those settings in the config.json file of your Punch project, under a section named publish.

    "publish" : {
        "strategy" : "s3", 
        "options" : {
            "bucket" : "BUCKET",
            "key" : "KEY",
            "secret" : "SECRET",
            "x-amz-acl": "public-read"
        }
    }

Then on the terminal, run the following command:

    punch p

This will publish the Reading List site you just created to S3. Point your browser to the URL of your S3 bucket and you should see the site.

You can check the sample site I created by visiting this URL: http://readlist.s3-website-us-east-1.amazonaws.com

In the future, if you want to add a new book or update an existing entry, all you need to do is edit the list in contents/index.json and then run the command punch p. It will publish only the files modified since the last update.

Extending Punch

In this tutorial, I covered the basic workflow for creating and publishing a site with Punch. You can easily extend and customize it further based on your requirements.

For example, you can implement your own content handler to enable sorting the book list by different criteria. Also, if you prefer to use a different templating language than Mustache, you can write your own template engine wrapper for Punch (check out the implementation of the Handlebars engine).

If you're interested in learning more about the available features in Punch and how to extend them, you can refer to the Punch Guide.

My Story of 1996 World Cup
http://laktek.com/2012/10/07/how-i-remember-1996-world-cup
Sat, 06 Oct 2012 16:00:00 GMT

Every Sri Lankan you meet today will have only one topic to talk about: the T20 World Cup finals. You will hear them saying how Malinga can silence Chris Gayle, what makes Mahela the best captain in the game, why Sri Lanka couldn't win the cup on the last 3 occasions and, of course, why the 1996 world cup victory was so special. No matter how hard you try, the 1996 world cup victory will keep popping up in any cricket related conversation with a Sri Lankan. Everyone has their own story to share about the 1996 victory.

Here's mine.

It was another eventful day in the new school year. We were sitting for the Grade 5 scholarship exam that year, so everyone seemed more enthusiastic about their studies than before. Just after the interval, some were rushing around the teacher's table to get their books corrected, some were fighting in the corner and some of us were just chatting. "BOOM!" Suddenly a noise like thunder was heard from some distance away. We even felt a tremor in our old classroom building. The crows flocking in the nearby banyan tree seemed frightened. There was pin-drop silence in the class, which had been noisier than a Sunday market moments ago.

"That's a bomb." someone murmured. We knew it was close, but had no idea where. Teacher tried to calm down the class and continue with the lesson.

About an hour later, someone came into our classroom to meet the teacher. It was my dad. His shirt looked brownish. His otherwise perfectly combed hair looked messy. I couldn't hear what he was saying to the teacher. After the conversation, the teacher came to me and asked me to get ready to go home. I had no idea what was happening. Then a huge feeling of fear entered my mind. My legs started to shiver. "Where's mother??" I asked dad, hurriedly walking out of the class. "She's there," he said. For a moment, I thought he was lying, that something terrible must have happened. Why was he taking me out of class early?

Dad wasn't lying. Mom was there, down the stairs with my younger brother. She had gone to pick him up from his class as well. Her face looked pale. There was no expression on it. I still had no idea what was happening. "Why does your shirt look dirty? Did you have an accident??" I asked dad again. "No, I'm alright."

Only when we returned home and switched on the TV did I realize what had happened. The LTTE had carried out a brutal attack in Fort, the commercial center of Colombo. A lorry filled with explosives had been crashed into the Central Bank, leaving only debris behind. The Ceylinco Building, where my dad worked, was burning in flames after being fired on with RPGs. Glass was shattered in the WTC and Bank of Ceylon buildings; the latter is where my mom worked. It was said that forces were able to hunt down the cadres before they launched attacks on those buildings.

The attack killed 91 people and left thousands permanently injured. It was a miracle that both of my parents survived one of the most outrageous terrorist attacks in the world. But they seemed to be moving on with the daily chores. Maybe they were too shocked, or were just trying to hide the gravity of the incident from us. Later, dad told us those were blood stains on his shirt, from carrying injured people out of the building.

The fear that had entered my mind was still there. I couldn't sleep. I had nightmares. I couldn't erase those images of burnt bodies, people jumping out of flaming buildings and kids mourning over their parents' coffins. That could have been my parents. I feared there would be more attacks. I wanted my parents to stay at home instead of going to work.

The only solace for me was going to school. There, I was able to be with friends, engage in different activities and forget these fearful thoughts. However, one morning the government announced that schools would be closed indefinitely, considering the prevailing security situation in the country. It was rumored the terrorist leader Prabhakaran had proclaimed he would make thousands of mothers weep in the South.

Being able to stay at home without going to school would have made any 10-year-old happy. But it actually made me sad and crazy. Without anything to occupy my mind, I now had to be at home all day with the fear and suspense of another bomb and of losing my parents.

Also, on the same day the schools were closed, the Cricket World Cup started. This time it was played in the sub-continent, jointly hosted by India, Pakistan and Sri Lanka. All the matches were shown on TV. Bored with nothing to do at home, I decided to watch every single match.

As the tournament started, Australia announced they would not tour Sri Lanka, considering the security risks. Soon the West Indies followed the same path. It was a major blow for the tournament and, more importantly, a huge disappointment for the Sri Lankan fans, who had waited for this historic occasion. At this time, cricketers from India and Pakistan stepped up to form a Friendship XI and play a match in Colombo to support Sri Lanka. Nobody would have believed these arch-rivals could share one dressing room and hug each other after taking a wicket, until this match. I remember Wasim Akram saying he forgot his kit bag and was wearing one of Azharuddin's shirts for the match. I felt this was the real power and beauty of sports.

None of the early setbacks had any effect on the Sri Lankan team's performances. Actually, it appeared those incidents had made them stronger. They won all of their group matches comfortably and qualified for the knockouts. In the match against Kenya, Sri Lanka smashed 398 runs in the 50 overs, which was the world record at that time.

Meanwhile, I was getting fully immersed in the excitement of the world cup. I started writing down the scores of every single match in an exercise book, and even calculated the statistics for each cricketer. When a player's stats were shown on TV, I compared them with my records to see if they tallied. I remember reverse engineering how the batting average and strike rate were calculated. It took me some time to realize that I had to subtract the not-outs from the innings batted before taking the average.

Instead of fearful nightmares of bombs, I was dreaming of Sanath playing his square cuts and Aravinda playing his pulls. My morning prayer changed from "God let there be no bombs today" to "God help us to win the match today".

We reached the semi-finals of the World Cup. The match was played against India at Eden Gardens, Calcutta. The stadium looked packed. It felt like a battle between 11 Sri Lankans and 100,000 Indians. India won the toss and made Sri Lanka bat first. So far, Sri Lanka had won all their matches chasing; they hadn't batted first in any of the matches before. Then, in the first over itself, the Indians sent the Sri Lankan openers Sanath and Kalu back to the pavilion. It was the explosive starts the duo produced that had helped Sri Lanka win all their previous matches. That was a mighty blow. However, then came Aravinda to play one of the best knocks I have ever seen to this date. He scored a brisk 66 runs, and his stroke play was scintillating. The rest of the batsmen built on the foundation laid by Aravinda and took Sri Lanka to a respectable 250 runs.

The Indians were going comfortably in their chase. But it all changed when Kalu created a stumping opportunity out of nowhere off Sanath's bowling, which sent Sachin Tendulkar back to the pavilion. It was like payback for what they had missed with the bat. The fall of Sachin made India crumble. The scoreboard, which read 100 for 1 moments ago, now read 120 for 8. There was no chance for India to win the match from that point.

Suddenly, the game was halted. It seemed some of the Sri Lankan players had been hit with water bottles from the stands. Then we could see people trying to set fire to the stands. The fear started to creep into my mind again. Would they hurt our players?

As the riots in the stadium got stronger, they decided to end the match and award the victory to Sri Lanka. TV cameras focused on a person holding a banner in the middle of the flames, which read "Congratulations Sri Lanka! We are sorry". It was apparent the anger was not against the Sri Lankans, but against their own team. Azhar, who had been worshiped as an idol a few weeks earlier, had a sad and disgraceful ending to his career.

There was no better ending to Sri Lanka's fairy-tale journey in the World Cup than meeting Australia in the finals. It was they who had called Murali a chucker during the Boxing Day test match. Sri Lankans believed the Aussies knew Murali was going to be a threat to world cricket in the years to come and wanted to end his career early. Also, Murali was the only Tamil cricketer in the Sri Lankan team and everyone treated him as a national pride. The distaste for the Aussies grew further when they forfeited the group matches in Sri Lanka.

So this looked like the perfect occasion to take revenge on the Aussies. Unlike the Aussies, the Sri Lankans didn't choose any bastard tricks. Instead, they replied by playing the gentleman's game and totally outclassing the Aussies in all departments of play. The whole city turned into one big party when the Sri Lankan skipper Arjuna scored the winning runs with a boundary.

We lit firecrackers. I ran around the house.

A few days after the World Cup, schools started again. We carried bats and balls to school and were allowed to play cricket during the P.E. period instead of those boring exercises. I tried to walk down the wicket and hit a six over long-on like Sanath, only to get bowled by my friend Prabhath, who sent down off-spinners like Murali.

Every one of us had a new dream!

Simple Helper to Extract Values from a String
http://laktek.com/2012/10/04/extract-values-from-a-string
Wed, 03 Oct 2012 16:00:00 GMT

While writing a blogging engine based on Punch, I needed to implement a way to extract values from a path based on a permalink pattern defined by the user.

After writing a helper function to solve this case, I realized it could be useful in other similar cases too. So I extracted it into its own package, named ExtractValues.

Here's how you can use it:

var extractValues = require('extract-values');

extractValues('/2012/08/12/test.html', '/{year}/{month}/{day}/{title}.html')
>> { 'year': '2012', 'month': '08', 'day': '12', 'title': 'test' }

extractValues('John Doe <john@example.com> (http://example.com)', '{name} <{email}> ({url})')
>> {'name': 'John Doe', 'email': 'john@example.com', 'url': 'http://example.com' }

extractValues('from 4th October  to 10th  October',
                'from `from` to `to`',
                { whitespace: 1, delimiters: ['`', '`'] })
>> {'from': '4th October', 'to': '10th October' }

extractValues('Convert 1500 Grams to Kilograms',
                'convert {quantity} {from_unit} to {to_unit}',
                { lowercase: true })
>> {'quantity': '1500', 'from_unit': 'grams', 'to_unit': 'kilograms' }

Options

From the above examples, you may have noticed that the helper function accepts several options. Let's see what those options mean.

whitespace - normalizes the whitespace in the input string, so it can be aligned with the given pattern. You can define the number of continuous whitespace characters to retain in the string. Setting it to zero (0) will remove all whitespace.

lowercase - converts the input string to lowercase before matching.

delimiters - you can specify which characters act as the value delimiters in the pattern. The default delimiters are { and }.

This is intended to be used for matching trivial and definite patterns, especially in contexts where you want to give end-users the option of providing patterns. For more complex and fuzzy matching, you would be better off with regular expressions.
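
For comparison, here's a rough sketch of what the first permalink example would look like with a hand-written regular expression (not part of the library):

var match = '/2012/08/12/test.html'.match(/^\/(\d{4})\/(\d{2})\/(\d{2})\/(.+)\.html$/);
// match[1] === '2012', match[2] === '08', match[3] === '12', match[4] === 'test'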

You can check the source on GitHub or directly install it from NPM (npm install extract-values).

P.S.: There were some great suggestions and alternate language implementations in the HackerNews discussion for this post.

This helper has since been ported to several other languages:

Punch Reloaded
http://laktek.com/2012/09/18/punch-reloaded
Mon, 17 Sep 2012 16:00:00 GMT

Punch started out as a simple static site generator, which I wrote to use in my own work. After releasing it to the public, it attracted a bunch of passionate early adopters who provided some valuable feedback on how to improve it. Their feedback gave me a better perspective on what people really expect from a modern-day web publishing tool.

So last July, I took two weeks off from my day job to create the initial prototype for a new Punch. Then, for the next two months, I spent most of my early mornings and late nights turning this concept into a reality.

Today, I present to you Punch Reloaded, which I deem a modern web publishing framework.

Everything is modular

The problem with web development is that everything changes so fast. Devices change. Browsers change. Best practices change. The tools we use need to change as well.

However, this rapid pace of change is not peculiar to web development. It's been a characteristic of the computing field from its inception. If there was a system that understood and embraced this concept of rapid change, it was Unix. Unix followed a modular design, which allowed it to evolve with the times and needs.

Punch was inspired a lot by this Unix philosophy. The framework is composed of small, self-contained modules, each written to do only one particular task. You can take everything apart and put it together the way you want. You can easily extend the default stack to support any template engine, parser or pre-compiler you want. You can define the way you want to handle content and layouts. You can even embed Punch within a larger application.

To learn more about the possibilities, refer to the sections under Customizing Punch in the Punch Guide.

Power your site with any JSON backend

By default, Punch fetches content from JSON files stored in the local file system. However, with the new modular design, you can use any backend that supports JSON to power your Punch sites. It could be a relational database such as Postgres, a document store such as Mongo, a third-party web service or even a new backend service like Firebase.

Check out this example app, which uses the Twitter API as its backend. I plan to release plugins for several popular backends in the near future. If you are interested in writing your own content handler, please check this article.

Flexible, inheritable layouts

With most Content Management Systems, you are stuck with a single layout for all pages in a site. But Punch makes it easy to use different layouts for different sections and pages of the site. Unlike in other frameworks, you need not explicitly define the layout you want to use; Punch will intuitively fetch the matching layout for a page, based on its name and hierarchy.

Read this article, to learn more about the organization of layouts.

Spice up the templates with helpers

Punch offers built-in helpers for common text, list and date/time operations. They are defined in a way that can be used with any template language.

What's more interesting is the option to define your own custom helpers. Custom helpers have access to the request context and can be used to enhance both the response headers (e.g. setting cookies) and the body. You can learn more about writing custom helpers in this article.

Minify and bundle assets

In these days of the mobile web, performance is a key factor for any kind of web site. Minifying and bundling JS/CSS assets are regarded as common best practices to increase the performance of a site. Apart from web app frameworks like Rails, most other site management workflows don't adopt such practices.

However, Punch offers a smart workflow to minify and bundle assets. So you can manage the front-end code without giving up on clarity or performance.

CoffeeScript and LESS out of the box

With Punch, you can write CoffeeScript and LESS as you would write regular JavaScript and CSS. Make the changes in your editor and just reload the browser to see them. Punch will do the compiling for you.

Check this screencast to understand how smooth the flow is:

Easy to get started!

Punch now comes with a default site structure and a hands-on tutorial to help you get started. Once you get acquainted, you can even use your own boilerplates to create a new site.

I have also rewritten the Punch Guide, to help you explore all the new features.

Make it your own...

As I said before, you can extend Punch's default stack by writing plugins. It's possible to write wrappers to use any backend, template engine, parser, pre-compiler and minifier with Punch. Basically, you can use Punch as glue code to build your own framework.

Reading the source is probably the best way to understand what happens under the hood. Also, you can refer to the specs to get a better idea of the actual behavior.

Go, give it a try and create some kickass sites!

]]>
Distraction Free Writing with Vim http://laktek.com/2012/09/05/distraction-free-writing-with-vim http://laktek.com/2012/09/05/distraction-free-writing-with-vim/#comments Tue, 04 Sep 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/09/05/distraction-free-writing-with-vim I recently configured Vim to switch to a special writing mode upon opening Markdown files. It helps me keep focus while writing by avoiding distractions. I first saw this concept in iA Writer and immediately fell in love with it. However, my muscle memory was too tied to Vim and I didn't want to switch to a different editing environment solely for this purpose. Hence, I tried to make Vim behave in a similar manner.

Check this screencast to see it in action:

For anyone interested, these are the steps I took to set it up. Please note that I tried this with MacVim on OS X Lion. It may not work as expected in other versions or on other OSs.

    " turn-on distraction free writing mode for markdown files
    au BufNewFile,BufRead *.{md,mdown,mkd,mkdn,markdown,mdwn} call DistractionFreeWriting()

    function! DistractionFreeWriting()
        colorscheme iawriter
        set background=light
        set gfn=Cousine:h14                " font to use
        set lines=40 columns=100           " size of the editable area
        set fuoptions=background:#00f5f6f6 " macvim specific setting for editor's background color 
        set guioptions-=r                  " remove right scrollbar
        set laststatus=0                   " don't show status line
        set noruler                        " don't show ruler
        set fullscreen                     " go to fullscreen editing mode
        set linebreak                      " break the lines on words
    endfunction
  • I also added this setting to .vimrc to toggle spell checking in normal mode.
    :map <F5> :setlocal spell! spelllang=en_us<CR>
  • Installed the Cousine font. It's a free alternative to Nitti Light, the font used by iA Writer.

  • Turned off Mac OS X's native full-screen mode for MacVim (otherwise the custom background color is not applied).

    defaults write org.vim.MacVim MMNativeFullScreen 0

You can find the customized versions of all needed files in this Git repo.

Update: I extracted the distraction free mode settings into its own plugin and allowed it to be toggled from the F4 key (now it won't be forced upon opening Markdown files). Check the repo in GitHub for updated settings.

]]>
SPDY for Rest of Us http://laktek.com/2012/07/10/spdy-for-rest-of-us http://laktek.com/2012/07/10/spdy-for-rest-of-us/#comments Mon, 09 Jul 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/07/10/spdy-for-rest-of-us There has been a lot of buzz around SPDY lately; it's a brand new protocol introduced by Google to make the web faster. Most major browsers and web servers have begun to make releases with SPDY support. So I felt it's worthwhile to explore SPDY and how it matters to us as web developers.

What is SPDY?

The concept of the world-wide web was developed on top of the HTTP protocol. It's an application layer protocol that uses TCP as the transport layer (if this layered concept confuses you, read about the OSI model). HTTP is stateless, and for each request the client has to create a new connection with the server. Though this initially helped to keep things simple, it causes issues with the current demands of the web.

SPDY is designed to address these shortcomings in the HTTP protocol. It uses a single connection for all request/response cycles between the client and server.

The core of SPDY is the framing layer, which manages the connection between two end-points and the transfer of data. There can be multiple data streams between the two end-points. On top of the framing layer, SPDY implements the HTTP request/response semantics. This makes it possible to use SPDY to serve any existing web site with little or no modification.

What are the benefits of using SPDY?

  • Since there's no connection overhead for each request, response latency will be lower.

  • Apart from the content body, SPDY also compresses the request/response headers. This is useful on mobile and other slow connections, where every byte counts.

  • SPDY's design mandates the use of SSL for all communication between client and server, which means sites will be more secure.

  • Rather than waiting until the client initiates a request, the server can push data to the client with SPDY. This helps sites do smart prefetching of resources.

Is it production ready?

Many of Google's apps and Twitter already use SPDY to serve their content. The Amazon Silk browser is also known to use a SPDY-powered proxy.

SPDY sessions in Chrome

Visit the following URL in Chrome to see which sites are currently using SPDY.

    chrome://net-internals/#spdy

How can I add SPDY support for my site?

The easiest way to add SPDY support is by enabling mod_spdy for Apache. mod_spdy alters Apache's default request/response cycle via hooks and adds SPDY handling.
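For example, on a Debian or Ubuntu server it roughly comes down to installing Google's mod_spdy package and restarting Apache (a rough sketch; package names and steps may differ for your platform, so check the mod_spdy install docs):

  # assuming you've downloaded Google's mod-spdy .deb package
  sudo dpkg -i mod-spdy-*.deb
  sudo apt-get -f install       # pull in any missing dependencies
  sudo a2enmod spdy             # make sure the module is enabled
  sudo service apache2 restart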

The speed gains from switching to SPDY may vary depending on your site's existing configuration. If you are using HTTP performance optimizations such as wildcard asset hosts (i.e. assets%d.example.com), that could cause SPDY to create multiple connections to the same end-point, thus reducing its efficiency. Some browsers, like Chrome, handle this smartly by pooling the connections. Also, CDNs such as Amazon CloudFront still don't support SPDY, so those resources need to be loaded over plain HTTP connections.

You can use a tool like the Chromium Page Benchmarker to check how your site performs with and without SPDY under different configurations.

Screenshot of Page Benchmarker

Do I need to have a SSL certificate?

In the initial connection, mod_spdy uses the SSL handshake and Next Protocol Negotiation (NPN) to advertise SPDY availability to the client. Also, to work across existing HTTP proxies, SPDY requires data streams to be tunneled through SSL. This means you currently need a valid SSL certificate for your site to support SPDY.

However, it is possible to test SPDY locally without SSL. For this, you can run Chrome with the flag --use-spdy=no-ssl and use a SPDY server implementation that works without SSL.
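For example, on OS X you could launch Chrome like this (assuming Chrome isn't already running, since the flag is only read at startup):

  open -a "Google Chrome" --args --use-spdy=no-ssl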

Does SPDY help in building real-time web apps?

It's worth noting that SPDY doesn't provide any special benefits for the use of WebSockets. Though they might look similar in high-level descriptions, they are totally independent protocols. They were created for different purposes and even the internal framing algorithms of the two are different.

On the other hand, SPDY's inherently asynchronous streams will help in implementing features such as Server-Sent Events.

SPDY is not a silver bullet!

SPDY will continue to mature as a protocol and, with time, it may well succeed HTTP. But unless you run a heavyweight site, supporting SPDY won't have any immediate effect on your conversions or cost savings.

So it's always better to start with the general page optimization techniques and then consider SPDY if you still want to cut down those extra milliseconds.

Further Reading:

]]>
Punch Status - Instant Previews, Easy Publishing & More http://laktek.com/2012/06/30/punch-status http://laktek.com/2012/06/30/punch-status/#comments Fri, 29 Jun 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/06/30/punch-status It's been two months since I originally released Punch. I was encouraged by the initial feedback it received and have been committing most of my free time to improving it further. In the past two months, Punch has made a lot of progress and several nice features were added to the project.

Punch focuses on providing a simple, better workflow to create, preview and publish websites. This quick screencast will help you get a better understanding of Punch's workflow:

Here are some of the important features that were introduced recently.

Instant Previews

Punch now ships with a built-in development server. This will re-generate the site before each page request, making it possible to preview changes in the browser instantly.

To start the server, run the following command in your site's directory:

    punch s

This will start a server listening on port 9009. If you want, you can provide an alternate port when starting the server.

    punch s 3000

Easy Publishing

A site created with Punch is just a collection of static files. You can publish it by simply copying the files to a web host. Now, Punch makes the publishing process even simpler for you. All you need to do is run a single command (punch p) to publish your site. Punch can handle publishing to Amazon S3 or to any remote server that supports the SFTP protocol.

To learn more details about publishing, check this wiki article.

New Homepage & Boilerplate

Punch now boasts a beautiful homepage, thanks to the great work done by Collin Olan. I hope the new homepage will add a positive vibe to the project.

Also, Collin created Backfist, a boilerplate for creating sites with Punch. It's a good base to get started from and learn the conventions of Punch.

Guide Wiki

One of the common problems with most open source projects is outdated and scattered documentation (in blog posts, issue tickets, READMEs, wikis and even word of mouth). I thought it would be better to set a convention early to avoid this happening to Punch.

So I created Punch wiki, which will serve as the official guide for the project.

Future

I'm really happy with the way the project progressed in its first two months. I have plans to make it even more awesome in the days to come. The first step would be to get more people to try Punch and convince them to make it part of their toolbox.

So if you are already using Punch, spread the good word about it to others. If you see any issues or possible improvements, please feel free to report them and help get them fixed.

Remember, behind every successful open-source project is a great community!

]]>
Don't Fret Over Client-Side Rendering http://laktek.com/2012/05/31/dont-fret-over-client-side-rendering http://laktek.com/2012/05/31/dont-fret-over-client-side-rendering/#comments Wed, 30 May 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/05/31/dont-fret-over-client-side-rendering Twitter is ditching client-side rendering and moving back to server-side. Does this mean client-side rendering is bad? Should we also move back to server-side? Well, it's insane to spread FUD about client-side rendering just because it didn't work for Twitter (remember "Rails can't scale"?). Rather, make it a lesson in how to use client-side rendering sensibly.

"In our fully client-side architecture, you don’t see anything until our JavaScript is downloaded and executed."
- Twitter Engineering Blog

In essence, this was the problem. When serving a web request, it's important to get the most essential information in front of users' eyes quickly. The best way to achieve this is to identify and render the essential blocks on the server itself. You can then delegate the rest to be rendered on the client.

We take this approach with CurdBee. The screenshot below shows how CurdBee's Home screen is displayed on a slow connection or when JavaScript is disabled.

screenshot of CurdBee home screen

In most instances, users come to the Home screen only to proceed to another action, such as creating an invoice or adding a time entry. By rendering the navigation blocks on the server, we allow them to take that action immediately. Since the remaining blocks are rendered progressively, users can start interacting without having to wait till everything is loaded.

We use a very simple client-side rendering logic. Templates and data (JSON) are fetched and rendered only when they are actually needed on screen. There's no eager loading involved. This helps us keep the bootstrapping JavaScript code to a minimum, so it loads and parses faster. Also, rendering happens asynchronously, allowing us to render the blocks progressively. We use appropriate caching for the templates and data to make the responses fast.
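As a rough illustration of that idea (a minimal sketch, not CurdBee's actual code, assuming jQuery and a mustache.js version that provides Mustache.render), a block can be fetched and rendered only when it's needed:

  // Minimal sketch: fetch a template and its JSON only when the block is needed,
  // then render it asynchronously with Mustache.js.
  function renderBlock(container, templateUrl, dataUrl) {
    $.when($.get(templateUrl), $.getJSON(dataUrl)).done(function(tmpl, data) {
      // $.when hands each result back as [response, statusText, jqXHR]
      $(container).html(Mustache.render(tmpl[0], data[0]));
    });
  }

  // render a (hypothetical) recent invoices block on demand
  renderBlock("#recent_invoices", "/templates/invoices.mustache", "/invoices.json");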

So if used properly, client-side rendering can actually improve the overall user experience. Rather than trying to do everything on the server-side or the client-side, figure out how you can have the best of both worlds.

]]>
Winning People with Code http://laktek.com/2012/05/23/winning-people-with-code http://laktek.com/2012/05/23/winning-people-with-code/#comments Tue, 22 May 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/05/23/winning-people-with-code Last Monday, Laknath and I met with Jervis and his lovely wife Himashini, for what turned out to be the first unofficial meetup of Punch users.

Jervis is an electrical engineer from Australia and he runs his own company, where he teaches Python to other electrical engineers (well, it was fascinating to learn that our power plants are controlled by Python scripts). Jervis has been one of the early adopters of Punch and he contributes to making it better. A couple of weeks back, he emailed me saying he was visiting Sri Lanka and would love to meet me for a coffee. I was flattered by that compliment itself.

So last Monday, Jervis came to Sri Lanka with his wife and we met in the evening at Tintagel. It was amazing to listen to Jervis enthusiastically explaining how he uses Punch to host his video tutorials and how it simplified his workflow. We discussed the future of the project, what can be improved and what other interesting tools we can integrate with Punch. Our chat didn't stick only to Punch. We moved on to many other interesting topics such as hacker culture, how to learn human languages quickly (thanks to his wife, Jervis can speak Sinhala quite well) and even how to make ice cream at home (meanwhile, our evening stretched from a coffee into a wonderful dinner). Jervis and Himashini are really nice people and we share a lot in common. I'm glad a project like Punch helped us become friends.

This is the beauty of writing and releasing code. Many still think of coding as an isolated battle with a dumb machine. Very few would believe you if you said you could win people over with code. But it's true.

Almost all the projects I open-sourced were originally written just to scratch my own itch. I never thought others would find them useful or contribute back to make them better. But the messages I receive via GitHub say otherwise. I realized that every piece of code we write has the potential to make some kind of impact on others' lives. I see code as a powerful form of expression, which can bring people around the world together.

If you love your code, share it. The feeling you will get when others fall in love with your work is just amazing.

]]>
Rapidly Prototyping Web Applications with Punch http://laktek.com/2012/05/17/rapidly-prototyping-web-applications-using-punch http://laktek.com/2012/05/17/rapidly-prototyping-web-applications-using-punch/#comments Wed, 16 May 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/05/17/rapidly-prototyping-web-applications-using-punch The process of prototyping a web application is two-fold. You have to focus on the front-end, as well as the underlying data layer which powers it. The representation of that underlying data layer is what we usually call the API. The front-end should know the types of data exposed by the API, and the API should know what data to expose to the front-end. So it is essential to prototype these two layers hand in hand.

However, it's extremely difficult to peel apart and handle these two layers seamlessly with existing prototyping tools. After trying out different prototyping techniques, I finally settled on a simple workflow based on Punch.

In this post, I'll explain how to use this workflow to prototype a web application.

Start with the Front-end

First of all, we should create a new project for the prototyping task. It can be done by running:

  > punch setup simple_issues

I thought it would be cool to create a prototype for a simple issue tracker similar to GitHub Issues, hence the name simple_issues. Inside the simple_issues directory you will find two other directories - contents & templates - and a config file.

Let's start off by prototyping the interface that shows a list of issues. We shall create a file named issues.html in the templates directory. Personally, I used boilerplate code based on HTML5 Boilerplate and Twitter Bootstrap to build the prototype. Twitter Bootstrap is great for these kinds of quick prototyping tasks. It's trivial to select a layout, add visual components and even assign basic interactivity to them. It allows us to focus on the overall arrangement, rather than spending time tinkering with minor details.

After adding all the essential UI blocks to the prototype, we can proceed to check the result in the browser. For this, we need to start Punch's development server (this is actually a new feature introduced in Punch, so make sure you have the latest version installed).

  > cd simple_issues 
  > punch server

Punch server will start on port 9009 by default. You can visit http://localhost:9009/issues to view the result.

Here's the initial prototype I came up with.

Extract the API

As I said before, the front-end interface covers only part of the prototyping process. We need to figure out the API as well. For this, we have to look at what data is really needed to make the front-end meaningful.

In this example, the list of issues is our main concern. Guided by the front-end, we can come up with a basic representation of the API response for a list of issues. Note that we use JSON as the data format for our API.

  {
    "issues": [

      {
          "id": 8
        , "permalink": "http://localhost:9009/issue"
        , "title": "Any way to get Punch on windows?"
        , "user": {
            "name": "sc0ttwad3" 
          }
        , "state": "closed"
      }
    ]
  }

We save our API prototypes in the contents directory, using the same name as the corresponding front-end prototype (in this case, issues.json).

In the original front-end prototype we had the list of issues hard-coded as a table:

  <table class="table table-striped">
    <thead>
      <tr>
        <th>#</th>
        <th>Description</th>
        <th>Reported By</th>
        <th>State</th>
      </tr>
    </thead>

    <tbody>

      <tr>
        <td>8</td>
        <td><a href="#">Any way to get Punch on windows?</a></td>
        <td>sc0ttwad3</td>
        <td>Closed</td>
      </tr>

    </tbody>
  </table>

We shall replace those table rows with a Mustache template, so that we can hook in the API prototype we just created.

  {{#issues}}
  <tr>
    <td>{{id}}</td>
    <td><a href="{{permalink}}">{{title}}</a></td>
    <td>{{user.name}}</td>
    <td><span class="label">{{state}}</span></td>
  </tr>
  {{/issues}}
  {{^issues}}
  <tr>
    <td colspan="4">Hooray! You have no issues.</td>
  </tr>
  {{/issues}}

Remember to save the prototype page again with the name "issues.mustache". Otherwise, Punch will not know it needs to be generated.

Refine and Repeat

Now you can try changing the issues.json file to see how it reflects on the front-end prototype. You can refine the front-end until you're satisfied with the representation of the data. Also, you can tweak the API to suit the requirements of the front-end.

Since Punch's development server generates the site on each request, you can check these changes immediately. Check the following demo to understand the full effect:

Reuse Parts

Another benefit of using Punch for prototyping is that you can easily reuse the parts that are repeated across prototypes. For example, if we want to prototype the page for an individual issue, we can reuse the header and footer sections we created earlier in the issues page. Move the repeated block into a new file and give it a name starting with an underscore (e.g. _header.mustache). Then Punch will treat it as a partial.

Here's the sample template for a single issue, which includes header and footer as partials:


  {{> header }}

  <div class="row-fluid">

    {{#issue}}
    <div class="span12">
      <h2>{{title}}</h2>
      <span>Reported by {{user.name}} on {{created_on}}</span>
      <p>{{{body}}}</p>
    </div>
    {{/issue}}

    <hr/>

    {{#comments}}
    <div class="comment">
      <span>{{user.name}} said ({{created_on}}):</span>
      <p>{{body}}</p>
    </div>
    {{/comments}}

  </div>

  {{> footer }}

Moving on to Implementation

Once you are done with prototyping the front-end and API for a feature, you can start implementing it. Designers can use the front-end prototype as the base for the actual template implementation. Since Mustache templates don't embed any logic, their definitions can be easily translated into any other templating language you prefer (such as ERB, Jade, etc.).

Similarly, developers can use the prototyped API responses as the expectations for the actual API implementation. Prototyping this way helps reduce a lot of the impedance mismatches that normally arise during the integration of the front-end and the API.

]]>
Create Quick HTML5 Presentations with Punch http://laktek.com/2012/04/27/create-quick-html5-presentations-with-punch http://laktek.com/2012/04/27/create-quick-html5-presentations-with-punch/#comments Thu, 26 Apr 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/04/27/create-quick-html5-presentations-with-punch HTML5 presentations are cool and convenient. They just work in the browser, and thanks to CSS3 & JavaScript they can be made to look even better than traditional slides. It's easy to link to individual slides. Also, demos (if you're showing something related to web technologies) can be done on the slides themselves. There are already a couple of good frameworks such as Google's HTML5Slides, deck.js and impress.js for creating HTML5 presentations.

However, in all the above frameworks, you have to define the slides as HTML blocks (usually wrapped inside a div or a section). It's fairly trivial; but you may find editing the slides gets messy and verbose when you have dozens or more of them. Especially if you are trying to put together a presentation in a rush (which you should try to avoid).

I felt the same while creating the presentation on Punch, which I recently presented at RefreshColombo. Then I realized Punch itself can be used to make the process of creating HTML5 presentations quick and painless.

This is what I did. Rather than adding the slides into the HTML page, I created them in a separate JSON file. Here's how it was structured.

  {
    "slides": [
      {
      "slogan": "Main Title of Presentation"
      },

      {
        "slogan": "Title of the Slide"
      , "primary_text": "A summary or sub-title"
      },

      {
        "slide_title": "Title of the Slide"
      , "primary_text": "This is a slide with just text. This is a slide with just text."
      },

      {
        "slide_title": "Title of the Slide"
      , "image": "cat_picture.jpg"
      , "footnote": "Source: http://www.flickr.com/photos/splityarn/2363974905/"
      }
    ]
  }

Then I saved the HTML page for the slides as a Mustache template. There I added a section for slides, which can be rendered iteratively based on different content (as defined in the above JSON). Here's the Mustache portion of the template which would be rendered:


  ...

  {{#slides}}
  <article class="{{class}}" >

    {{#slogan}}
      <h2>{{{slogan}}}</h2>
    {{/slogan}}

    {{#slide_title}}
      <h3>{{{slide_title}}}</h3>
    {{/slide_title}}

    {{#primary_text}}
      <p class="primary">{{{primary_text}}}</p>
    {{/primary_text}}

    {{#image}}
      <img src="images/{{image}}" class="centered"/>
    {{/image}}

    {{#code}}
      <pre>{{code}}</pre>
    {{/code}}

    {{#list}}
      <ul class="build">
        <li>{{{.}}}</li>
      </ul>
    {{/list}}

    {{#secondary_text}}
      <p class="secondary">{{{secondary_text}}}</p>
    {{/secondary_text}}

    {{#footnote}}
      <p class="source">{{{footnote}}}</p>
    {{/footnote}}

  </article>
  {{/slides}}

  ...

Finally, I ran the punch command to generate the following HTML5 presentation - http://laktek.com/presentations/punch/slides.html

I have created a simple bootstrapper, by extracting the common slide formats from my presentation, which you can use straight away. I used a slightly modified version of Google's HTML5Slides. If you are happy with the default styles, you can just edit the contents/slides.json file and replace it with your own slides. When you are done, run the punch command in the top-most directory to generate the HTML output.

You can edit the default template (templates/slides.mustache) to define any additional blocks to use in the slides or even change the template entirely to use a different presentation framework such as deck.js or impress.js.

The source is available on GitHub. Fork it and use it!

]]>
Punch - A Fun and Easy Way to Build Modern Websites http://laktek.com/2012/04/19/punch-a-fun-and-easy-way-to-build-modern-websites http://laktek.com/2012/04/19/punch-a-fun-and-easy-way-to-build-modern-websites/#comments Wed, 18 Apr 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/04/19/punch-a-fun-and-easy-way-to-build-modern-websites A few months ago, I switched this blog from WordPress to Jekyll. I love how Jekyll allows me to prepare everything locally and simply publish when it's ready. There's no server-side logic involved, which means I can host the whole blog in an S3 bucket. Also, there are no more worries about mundane issues like security, performance or database corruption.

I wanted to have this freedom and control in any site that I would create and manage. Actually, most websites can be thought of as static sites. They may contain a few bits and pieces that need to be rendered dynamically. The rise of modern browsers means we can pass that concern to the client-side.

However, websites have different requirements from a blog. A website contains different pages carrying unique presentation, along with a few reusable blocks (for example, take a look at the different pages in the 37signals site). A page will not contain just a title and a block of text, but is composed of different types of content such as lists, images, videos, slides and maps. Also, a modern website often needs to be A/B tested and translated into other languages. Even though we can use blog engines like Jekyll (or WordPress), I felt there's still a void for a tool tailored to creating and managing websites.

That's why I created Punch.

What is Punch?

The aim of Punch is to help anyone build (and maintain) modern websites using only HTML, CSS and JavaScript. Punch is largely inspired by Jekyll, but it's not a blog engine. It's intuitive to use and easy to extend.

Punch is written in Node.js and works with your local file system. Currently, Punch renders template files written in Mustache. It expects content to be available in JSON format. Punch can also parse content in Markdown format.

To generate a site, templates and contents should be available in two directories. For each Mustache template found in the templates directory, Punch will look for relevant content in the contents directory. Content should be presented in a JSON file having the same name as the template. Alternatively, you can create a directory with the same name as the template to store multiple JSON and Markdown files. Punch will save the rendered file in the output directory. Any other files (HTML, images, JS, CSS, etc.) and directories under templates will be copied to the output directory without any modification.
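To make that concrete, a Punch site might be laid out roughly like this (file names here are just for illustration):

  my_site/
    config.json
    templates/
      index.mustache
      _header.mustache        (a partial)
      assets/site.css         (copied as-is)
    contents/
      index.json
      index/intro.markdown    (extra content for the index template)
    output/
      index.html              (generated)
      assets/site.css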

How to Use

Here's a quick screencast on how to use Punch.

For more details on installation and usage, please refer to the README and the User Guide.

Easy Client-side Rendering

As I mentioned earlier, we can render any dynamic blocks in a site on the client-side. Since Punch is written in JavaScript, we can easily use its renderer on the client-side as well. You can see this feature used on Punch's homepage to render the "GitHub Watchers" block.

To use the renderer, you must include Mustache.js and Punch's Mustache renderer in the HTML page.

  <script type="text/javascript" src="assets/mustache.js"></script>
  <script type="text/javascript" src="node_modules/punch/lib/renderers/mustache.js"></script>

Then you have to instantiate a new MustacheRenderer and provide the template, content and any partials you need to render. Since Punch's renderer works asynchronously, it can be used reliably in contexts which involve AJAX content fetching. You can see below how I pass the JSON response from the GitHub API and the template fragment to the renderer to render the GitHub Watchers block. We must also provide a callback function to execute after rendering is done.

  // Load and Render GitHub followers
  (function(){
    if($("#github_followers").length > 0){
      var renderer = new MustacheRenderer();

      renderer.afterRender = function(output){
        $("#github_followers").html(output);
      };

      renderer.setTemplate('{{#followers}} \
                            <a href="http://github.com/{{login}}" rel="nofollow"><img size="16" src="{{avatar_url}}" title="{{login}}" alt="{{login}}"/></a> \
                            {{/followers}} \
                          ');
      renderer.setPartials({});

      $.getJSON("https://api.github.com/repos/laktek/punch/watchers", function(data){
        renderer.setContent({"followers": data});
      });
    }
  })();

Start Playing!

I have already been using Punch to create a few personal sites and just finished porting CurdBee's public site to Punch. Every time I used Punch it has been a pleasant experience, and I feel I have the freedom and control to create sites the way I want. I hope most of you will start to feel the same with Punch. Also, I have plans to support features such as browser-based content editing and easy publishing options, which could make Punch even more awesome.

Go install it, build nice sites, spread the word and send me pull requests!!

]]>
A Few cURL Tips for Daily Use http://laktek.com/2012/03/12/curl-tips-for-daily-use http://laktek.com/2012/03/12/curl-tips-for-daily-use/#comments Sun, 11 Mar 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/03/12/curl-tips-for-daily-use Though I knew cURL was a powerful tool, so far I had never made an attempt to get properly familiar with it. Most of the time, I would just wade through its man pages to find a way to get my stuff done. Recently, I found myself making use of it for many of my daily tasks. Through that heavy usage, a couple of recurring patterns emerged.

If you are already familiar with cURL, you may not find anything interesting or new here (but feel free to point out any improvements or other useful tips in the comments).

Resume failed downloads

cURL has a handy option (-C or --continue-at) to set a transfer offset, which helps to resume failed downloads. In most cases, setting the offset to a single dash will let cURL decide how to resume the download.

  curl -C - -L -O http://download.microsoft.com/download/B/7/2/B72085AE-0F04-4C6F-9182-BF1EE90F5273/Windows_7_IE9.part03.rar

It's a shame that I only came to know about this very recently. I'll now be cursing a lot less at my ISP.

Fetch request body from a file

Nowadays, most web service APIs expect request bodies to be formatted as JSON. Manually entering a JSON-formatted string on the command line is not a very convenient option. A better way is to prepare the request body in a file and provide it to cURL.

Here's an example of creating a gist, providing the payload from a JSON file.

  curl -d @input.json https://api.github.com/gists

Start the data parameter with a @ to tell cURL it should fetch the file from the given path.

Mimic AJAX requests

Sometimes I need to create endpoints in web apps that produce alternate responses when accessed via AJAX (e.g. not rendering the layout). Testing them directly in a browser is not very viable, as it requires bootstrapping code. Instead, we can mimic AJAX requests from cURL by providing the X-Requested-With header.

  curl -H "X-Requested-With: XMLHttpRequest" https://example.com/path

Store and Use Cookies

Another similar need is testing the behavior of cookies. Especially when you want to alter a response depending on a cookie value.

You can use cURL to save the response cookies to a file and then use them on subsequent requests. You can inspect the cookie file and even alter it to test the desired behavior.

  curl -L -c cookies.txt http://example.com 
  curl -L -b cookies.txt http://example.com

View a web page as GoogleBot

When I was running this blog with WordPress, Google marked it as a site infected with malware. Panicked, I visited the site and checked the source. I couldn't see anything suspicious. Later I discovered the malware was injected into the site only when it was accessed by GoogleBot. So how do you see a site's output as GoogleBot?

cURL's option (-A or --user-agent) to change the user-agent of a request comes in handy in such instances. Here's how you can impersonate GoogleBot:

  curl -L -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://example.com

Peep into others' infrastructure

This is not exactly a cURL feature, but it comes in handy when I want to find out what others use to power their apps/sites.

  curl -s -L -I http://laktek.com | grep Server

Yes, now you know this blog runs on Amazon S3 :).

]]>
Learning Go - Functions http://laktek.com/2012/02/23/learning-go-functions http://laktek.com/2012/02/23/learning-go-functions/#comments Wed, 22 Feb 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/02/23/learning-go-functions Functions are first-class citizens in Go. In the very first post of this series, we learnt that Go functions can return multiple values. Apart from multiple return values, there are several other interesting features in Go functions that are worth exploring.

Higher-order Functions

In Go, a function can take another function as an argument and can also return a function. This feature, known as higher-order functions in functional programming, is a great way to define dynamic and reusable behavior.

For example, the Map function from the strings package takes a mapping function as its first argument. We can come up with a simple cipher algorithm by passing a function to Map.

  output := strings.Map(func(c int) int {
        alphabet := "abcdefghijklmnopqrstuvwxyz"
        cryptabet := "THEQUICKBROWNFXJMPSVLAZYDG"

        return utf8.NewString(cryptabet).At(strings.Index(alphabet, string(c)))
    }, "hello")

What if you want to encode your message with a different cryptabet? How about something fancy like Sherlock Holmes' Little Dancing Men? The current implementation doesn't have that flexibility, so let's try to extend our code.

  func CipherGenerator(cryptabet string) func(int) int {
    return func(c int) int {

      alphabet := "abcdefghijklmnopqrstuvwxyz"
      encoded_cryptabet := utf8.NewString(cryptabet)

      return encoded_cryptabet.At(strings.Index(alphabet, string(c)))

    }
  }

  func main() {
    fmt.Printf(strings.Map(CipherGenerator("☺☻✌✍✎✉☀☃☁☂★☆☮☯〠☎☏♕❤♣☑☒✓✗¢€"), "hello"))
  }

So we created a function called CipherGenerator. It can accept any Unicode string as a cryptabet and returns a function of type func(int) int, which is assignable as a mapping function to strings.Map.

User-Defined Function types

When you define a function that returns another function, its signature can get a little too complex. In the previous example, the function signature was func CipherGenerator(cryptabet string) func(int) int. This is not easy to comprehend at a glance. We can make it more readable by declaring a named type for the returned function.

  type MappingFunction func(int) int

Now we can declare the CipherGenerator function like this:

  func CipherGenerator(cryptabet string) MappingFunction {
    // ...
  }

Since the underlying type matches, it is still assignable to strings.Map.

Closure

In the function literal that's returned from CipherGenerator, we are referencing the variable cryptabet. However, cryptabet is defined only in the scope of CipherGenerator. Then how does the returned function have access to cryptabet each time it runs?

This property is known as a closure. In simple terms, a function will inherit the variables from the scope in which it was declared.

Applying the same principle, we can also move the variable alphabet from the scope of the returned function to the scope of CipherGenerator.

  func CipherGenerator(cryptabet string) MappingFunction {
    alphabet := "abcdefghijklmnopqrstuvwxyz"

    return func(c int) int {

      encoded_cryptabet := utf8.NewString(cryptabet)

      return encoded_cryptabet.At(strings.Index(alphabet, string(c)))

    }
  }

Deferred Calls

In Go functions, you can use a special statement called defer. Defer statements invoke function or method calls immediately before the surrounding function (the function that contains the defer statement) returns.

Defer statements are most commonly used for cleanup jobs. Their execution is ensured whatever return path your function takes.

  func Read(reader io.ReadCloser) string {
    defer reader.Close()

    // ...
  }

A defer statement evaluates the parameters to the function call at the time the statement is executed. However, the function call isn't invoked until the surrounding function returns. Check the example below.

  func FavoriteFruit() (fruit string) {
    fruit = "Apple"
    defer func(v string) {
      fmt.Printf(v)
    }(fruit)

    fruit = "Orange"

    return
  }

Though the value of fruit is later changed to Orange, the deferred function call will still print Apple. This is because that was the value fruit held when the defer statement was executed.

Since we can use function literals in defer statements, the closure property applies to them too. Instead of passing the variable fruit in the defer function call, we can directly access it from the deferred function.

  func FavoriteFruit() (fruit string) {
    fruit = "Apple"
    defer func() {
      fruit = "Orange"
    }()

    return
  }

What is more interesting here is that the deferred call actually modifies the return value of the surrounding function. This is because the deferred call is invoked before the return values are passed to the caller.

You can have multiple defer statements within a function.

  func f() (output string) {
    defer fmt.Printf("first deferred call executed.") 
    defer fmt.Printf("second deferred call executed.") 

    return "Function returned"
  }

  fmt.Printf(f())

  // Output:
  // second deferred call executed.
  // first deferred call executed.
  // Function returned

As you can see from the output, defer statements are executed in Last-In-First-Out (LIFO) order.

Variadic Functions

If you've noticed, you can invoke fmt.Printf with a variable number of arguments. On one occasion, you may simply call it as fmt.Printf("Hello") and on another as fmt.Printf("Result of %v plus %v is %v", num1, num2, (num1 + num2)). Since you have to define the parameters when declaring a function signature, how is this possible?

You can define the last parameter of a function with a type prefixed with .... This means the function can take zero or more arguments of the given type. Such functions are known as variadic functions.

Here's a simple variadic function to calculate the average of a series of integers.

  func Avg(values ...int) float64 {
    sum := 0.0
    for _, i := range values {
      sum += float64(i)
    }
    return sum / float64(len(values))
  }

  Avg(1, 2, 3, 4, 5, 6) //3.5

In a variadic function, the arguments to the last parameter are collected into a slice of the given type. We can also pass an already composed slice, instead of individual values, to the variadic function. However, the argument should then be suffixed with ....

  values := []int{1, 2, 3, 4, 5, 6}
  Avg(values...) //3.5

By using the blank interface type with variadic functions, you can come up with very flexible functions similar to fmt.Printf.
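For example, here's a minimal sketch of such a function (assuming fmt is imported), which accepts values of any type and prints each one along with its type:

  func Describe(values ...interface{}) {
    for _, v := range values {
      fmt.Printf("%v is of type %T\n", v, v)
    }
  }

  Describe("hello", 42, 3.14, true)
  // hello is of type string
  // 42 is of type int
  // 3.14 is of type float64
  // true is of type bool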

Further Reference

Go Codewalk on First-Class Functions: http://golang.org/doc/codewalk/functions/

]]>
Learning Go - Interfaces & Reflections http://laktek.com/2012/02/13/learning-go-interfaces-reflections http://laktek.com/2012/02/13/learning-go-interfaces-reflections/#comments Sun, 12 Feb 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/02/13/learning-go-interfaces-reflections In the post about Go types, I briefly mentioned that Go provides a way to do run-time type inference using interfaces and the reflection package. In this post, we are going to explore these concepts in depth.

What is an Interface?

Imagine you stop a random cab expecting to go from A to B. You wouldn't care much about the nationality of the driver, how long he's been in the profession, whom he voted for in the last election or even whether he is a robot; as long as he can take you from A to B.

This is how interfaces work in Go. Instead of expecting a value of a particular type, you are willing to accept a value of any type that implements the methods you want.

We can represent our Cab Driver analogy in Go like this (assuming there are human and robot cab drivers):


  type Human struct {
  }

  func (p *Human) Drive(from Location, to Location){
    //implements the drive method
  }

  type Robot struct {
  }

  func (p *Robot) Drive(from Location, to Location){
    //implements the drive method
  }

  type Driver interface {
    Drive(from Location, to Location) 
  }

  func TakeRide(from Location, to Location, driver Driver){
    driver.Drive(from, to)
  }

  func main(){
    var A Location
    var B Location

    var random_human_driver *Human = new(Human)
    var random_robot_driver *Robot = new(Robot)

    //...

    TakeRide(A, B, random_human_driver)
    TakeRide(A, B, random_robot_driver)
  }

Here we defined a type called Driver, which is an interface type. An interface type contains a set of methods. A type supporting the interface should fully implement this method set.

In this instance, the pointer types of both Human and Robot (*Human and *Robot) implement the Drive method. Hence, they satisfy the Driver interface. So when calling the TakeRide function, which expects a Driver interface as an argument, we can pass a pointer to either the Human or Robot type.

You can assign any value to an interface, if its type implements all the methods defined by the interface. This loose coupling allows us to implement new types to support existing interfaces, as well as create new interfaces to work with existing types.

I recommend you read the section on interfaces in Effective Go, if you haven't already. It provides more elaborate examples of the usage of interfaces.

Encapsulation

Another benefit of interfaces is that they can be used for encapsulation. If a type only implements the methods of a given interface, it's OK to export only the interface without the underlying type. Obviously, this is helpful in maintaining a cleaner and more concise API.

In our previous cab driver example, the Human and Robot types don't have any methods apart from the Drive method. This means we can export only the Driver interface and keep the human and robot types encapsulated within the package (hm... a paranoid cab company which doesn't reveal the true identities of its drivers!).


  // ...
  // Type & method declarations were skipped

  func TakeRide(from Location, to Location, driver Driver){
    driver.Drive(from, to)
  }

  func NewDriver() Driver {
    // this constructor will assign 
    // a random value of type *human or *robot
    // to Driver interface.

    return
  }

  func main(){
    var A Location
    var B Location

    var random_driver Driver = NewDriver() 

    //...

    TakeRide(A, B, random_driver)
  }

We have introduced a new function called NewDriver(), which will return a Driver interface. The value behind the Driver interface could be of type either *human or *robot.

Empty Interface - interface{}

It's possible to define an interface without any methods. Such an interface is known as an empty interface, and it's denoted by interface{}. Since there are no methods, any type will satisfy this interface.

I'm sure most of you are familiar with the fmt.Printf function, which accepts a variable number of arguments of different types and produces formatted output. If we take a look at its definition, it accepts a variable number of empty interface (interface{}) values. This means Printf is using a mechanism based on empty interfaces to infer the types of values at run-time. If you read through the next sections, you will get a clue as to how it does that.
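As of Go 1, its declaration in the fmt package looks like this:

  func Printf(format string, a ...interface{}) (n int, err error)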

Type Assertion

In Go, there's a special expression which lets you assert the type of the value an interface holds. This is known as a type assertion.

In our cab driver example, we can use a type assertion to verify whether a given driver is a human.

  var random_driver Driver = NewDriver() 
  v, ok := random_driver.(*human)

If the NewDriver() method returns an interface holding a value of type *human, v will be assigned that value and ok will be true. If NewDriver() returns a value of type *robot, v will be set to nil and ok will be false.

With type assertions, it is possible to convert one interface value to another interface value too. For the purpose of this example, let's assume there's another interface called Runner, which defines a Run method, and that our *human type also implements the Run method.

  var random_driver Driver = NewDriver() 
  runner, ok := random_driver.(Runner)

Now, when the Driver interface holds a value of type *human, it is possible for us to convert the same value to be used with the Runner interface too.

Type Switches

Using type assertions, Go offers a way to perform different actions based on a value's type. Here's the example given in the Go spec for type switching:

  switch i := x.(type) {
  case nil:
    printString("x is nil")
  case int:
    printInt(i)  // i is an int
  case float64:
    printFloat64(i)  // i is a float64
  case func(int) float64:
    printFunction(i)  // i is a function
  case bool, string:
    printString("type is bool or string")  // i is an interface{}
  default:
    printString("don't know the type")
  }

Type switching uses a special form of type assertion, with the keyword type. Note that this notation is not valid outside of a type switch context.

Reflection Package

Combining the power of empty interfaces and type assertions, Go provides the reflect package, which allows more robust operations on types and values at run-time.

Basically, the reflection package gives us the ability to inspect the type and value of any variable in a Go program.

  import "reflect"

  type Mystring string

  var x Mystring = Mystring("awesome")
  fmt.Println("type:", reflect.TypeOf(x))
  fmt.Println("value:", reflect.ValueOf(x))

This gives the output as:

  type: main.Mystring
  value: awesome

However, this is just the tip of the iceberg. There's a lot more powerful stuff possible with the reflection package. I'll leave you with the "Laws of Reflection" blog post and the Godoc of the reflect package. I hope its capabilities will fascinate you.

Further Reading

]]>
Stack on Go - A Wrapper for Stack Exchange API http://laktek.com/2012/02/06/stack-on-go http://laktek.com/2012/02/06/stack-on-go/#comments Sun, 05 Feb 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/02/06/stack-on-go Regular readers of this blog will know I've been spending my free time learning Go. Today, I present to you the first fruit of those learning experiences. Stack on Go is a wrapper library, written in Go, for the Stack Exchange API.

When I first stumbled upon version 2.0 of the Stack Exchange API, I felt it was one of the best API designs I'd ever seen. So I decided to write a wrapper for it in Go, which was a good way to learn both Golang and modern API design techniques.

Stack on Go fully implements the Stack Exchange API 2.0 and is compatible with the Go runtime on Google AppEngine. I hope this could be a good platform for some interesting apps such as notifiers, aggregators and stat analyzers based on the Stack Exchange API (well, the possibilities are endless with such a rich dataset).

Also, bear in mind Stack Exchange is running a competition and offering an iPad 2 for the most awesome application submitted (that'd also give Stack on Go a great chance to become the best library ;) ).

So if you always wanted to learn Go but never got started, I hope this will be a great motivator.

Installation

Let's have a look at how to get started with Stack on Go.

Stack on Go is fully compatible with Go1.

To install the package, run:

  go get github.com/laktek/Stack-on-Go

Basic Usage

Once installed, you can use Stack on Go by importing it in your source.

  import "github.com/laktek/Stack-on-Go/stackongo"

By default, the package will be named stackongo. If you want, you can give it an alternate name at import.
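For example, importing it under a shorter (hypothetical) alias looks like this:

  import so "github.com/laktek/Stack-on-Go/stackongo"

  sites, err := so.AllSites(params)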

The Stack Exchange API contains global and site-specific methods. Global methods can be called directly, like this:

  sites, err := stackongo.AllSites(params)

Before calling site-specific methods, you need to create a new session. A site identifier should be passed as a string (usually, it's the domain of the site).

  session := stackongo.NewSession("stackoverflow")

Then call the methods in the scope of the created session.

  info, err := session.Info()

Most methods accept a map of parameters. There's a special Params type that you can use to create a parameter map.

  //set the params
  params := make(stackongo.Params)
  params.Add("filter", "total")
  params.AddVectorized("tagged", []string{"go", "ruby", "java"})

  questions, err := session.AllQuestions(params)

If you prefer, you can pass your parameters directly as a map[string]string literal:

  questions, err := session.AllQuestions(map[string]string{"filter": "total", "tagged": "go;ruby;java"})

Most methods return a struct containing a collection of items and meta information (more details are available in the Stack Exchange docs). You can traverse the results to create an output:

  for _, question := range questions.Items {
        fmt.Printf("%v\n", question.Title)
        fmt.Printf("Asked By: %v on %v\n", question.Owner.Display_name, time.SecondsToUTC(question.Creation_date))
        fmt.Printf("Link: %v\n\n", question.Link)
  }

You can use the returned meta information to make run-time decisions. For example, you can check whether there are more results and load them progressively.

  if questions.Has_more {
    params.Page(page + 1)
    questions, err = session.AllQuestions(params)
  }

Authentication

Stack Exchange follows the OAuth 2.0 workflow for user authentication. Stack on Go includes two helper functions tailored to the authentication flow offered by the Stack Exchange API.

AuthURL returns a URL to redirect the user to for authentication, and ObtainAccessToken should be called from the handler of the redirect URI to obtain the access token.

Check the following code sample, which explains the authentication flow:

  func init() {
    http.HandleFunc("/", authorize)
    http.HandleFunc("/profile", profile)
  }

  func authorize(w http.ResponseWriter, r *http.Request) {
    auth_url := stackongo.AuthURL(client_id, "http://myapp.com/profile", map[string]string{"scope": "read_inbox"})

    header := w.Header()
    header.Add("Location", auth_url)
    w.WriteHeader(302)
  }

  func profile(w http.ResponseWriter, r *http.Request) {
    code := r.URL.Query().Get("code")
    access_token, err := stackongo.ObtainAccessToken(client_id, client_secret, code, "http://myapp.com/profile")

    if err != nil {
      fmt.Fprintf(w, "%v", err.String())
    } else {
      //get authenticated user
      session := stackongo.NewSession("stackoverflow")
      user, err := session.AuthenticatedUser(map[string]string{}, map[string]string{"key": client_key, "access_token": access_token["access_token"]})

      // do more with the authenticated user
    }

  }

Using with AppEngine

If you plan to deploy your app on Google AppEngine, remember to do one slight modification in your code. Since AppEngine has a special package to fetch external URLs, you have to set it as the transport method for Stack on Go.

Here's how to do it:

  import (
    "fmt"
    "net/http"

    "appengine"
    "appengine/urlfetch"

    "github.com/laktek/Stack-on-Go/stackongo"
  )

  // AppEngine apps serve requests from HTTP handlers rather than main()
  func handler(w http.ResponseWriter, r *http.Request) {
    c := appengine.NewContext(r)
    ut := &urlfetch.Transport{Context: c}

    stackongo.SetTransport(ut) // set urlfetch as the transport

    session := stackongo.NewSession("stackoverflow")
    info, err := session.Info()
    fmt.Fprintf(w, "%v %v", info, err)
  }

Under the Hood

If you wish to write wrappers for other web APIs in Go, you might be interested in the implementation details of Stack on Go.

Actually, the implementation is fairly straightforward. The following method is the essence of the whole library.

  func get(section string, params map[string]string, collection interface{}) (error os.Error) {
    client := &http.Client{Transport: getTransport()}

    response, error := client.Get(setupEndpoint(section, params).String())

    if error != nil {
      return
    }

    error = parseResponse(response, collection)

    return

  }

Every method call is routed to the above function, with the relevant struct, path and parameters provided. Using the path and parameters, we generate the endpoint URL. This is then fetched using the http.Client methods. Afterwards, the response and the provided struct interface are passed to a custom parser function. There, the response body is read and parsed using json.Unmarshal. The JSON output is finally mapped to the provided struct via the interface. This is what the called method finally returns.

I used httptest, which is available in Go's standard packages, to unit test the library. All API calls were proxied (using a custom Transport) to a dummy server which serves fake HTTP responses. This setup makes it easy to test both request and response expectations.

  func createDummyServer(handler func(w http.ResponseWriter, r *http.Request)) *httptest.Server {
    dummy_server := httptest.NewServer(http.HandlerFunc(handler))

    //change the host to use the test server
    SetTransport(&http.Transport{Proxy: func(*http.Request) (*url.URL, os.Error) { return url.Parse(dummy_server.URL) }})

    //turn off SSL
    UseSSL = false

    return dummy_server
  }

  func returnDummyResponseForPath(path string, dummy_response string, t *testing.T) *httptest.Server {
    //serve dummy responses
    dummy_data := []byte(dummy_response)

    return createDummyServer(func(w http.ResponseWriter, r *http.Request) {
      if r.URL.Path != path {
        t.Error("Path doesn't match")
      }
      w.Write(dummy_data)
    })
  }

For those who'd like to dig deeper, the source code is available on GitHub. You can contact me if you need any help using Stack on Go. Also, feel free to report any issues and improvements.

]]>
Learning Go - Types http://laktek.com/2012/01/27/learning-go-types http://laktek.com/2012/01/27/learning-go-types/#comments Thu, 26 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/27/learning-go-types One of the main reasons I embrace Golang is its simple and concise type system. It follows the principle of least surprise and, as per Rob Pike, these design choices were largely influenced by prior experiences.

In this post, I will discuss some of the main concepts which are essential in understanding Golang's type system.

Pre-declared Types

Golang by default includes several pre-declared boolean, numeric and string types. These pre-declared types are used to construct other composite types, such as array, struct, pointer, slice, map and channel.
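For example, here's a quick sketch of a few composite types built from the pre-declared types:

  var ids []int                 // a slice of the pre-declared type int
  var scores map[string]float64 // a map with string keys and float64 values
  var done chan bool            // a channel of bool values

  type point struct {           // a struct composed of two ints
    x, y int
  }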

Named vs Unnamed Type

A type can be represented by an identifier (called a type name) or by a composition of previously declared types (called a type literal). In Golang, these two forms are known as named and unnamed types respectively.

Named types can have their own method sets. As I explained in a previous post, methods are also a form of function, for which you specify a receiver.

  type Map map[string]string

  //this is valid
  func (m Map) Set(key string, value string){
    m[key] = value 
  }

  //this is invalid
  func (m map[string]string) Set(key string, value string){
    m[key] = value 
  }

You can define a method with the named type Map as the receiver; but if you try to define a method with the unnamed type map[string]string as the receiver, it's invalid.

An important thing to remember is that pre-declared types are also named types. So int is a named type, but *int and []int are not.

Underlying Type

Every type has an underlying type. Pre-declared types and type literals refer to themselves as their underlying type. When declaring a new type, you have to provide an existing type. The new type will have the same underlying type as the existing type.

Let's see an example:

  type Map map[string]string
  type SpecialMap Map

Here the underlying type of map[string]string is itself, while the underlying type of both Map and SpecialMap is map[string]string.

Another important thing to note is that the newly declared type will not inherit any methods from the existing type or its underlying type. However, the method set of an interface type and the elements of a composite type remain unchanged. The idea here is that if you define a new type, you probably want to define a new method set for it as well.
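For example, here's a small sketch reusing the Map and SpecialMap declarations from above; the new type does not pick up Map's method set:

  type Map map[string]string
  type SpecialMap Map

  func (m Map) Set(key string, value string) {
    m[key] = value
  }

  func main() {
    m := Map{}
    m.Set("lang", "Go") // valid: Set is in Map's method set

    s := SpecialMap{}
    s.Set("lang", "Go") // compile error: SpecialMap has no method Set
  }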

Assignability

  type MyString string
  var str string = "abc"
  var my_str MyString = str //gives a compile error

You can't assign str to my_str in the above case. That's because str and my_str are of different types. Basically, to assign a value to a variable, the value's type should be identical to the variable's type. It is also possible to assign a value to a variable if their underlying types are identical and one of them is an unnamed type.

Let's try to understand this with a more elaborate example:

  package main

  import "fmt"

  type Person map[string]string
  type Job map[string]string

  func keys(m map[string]string) (keys []string) {
    for key, _ := range m {
      keys = append(keys, key)
    }

    return
  }

  func name(p Person) string {
    return p["first_name"] + " " + p["last_name"]
  }

  func main(){
    var person = Person{"first_name": "Rob", "last_name": "Pike"}
    var job = Job{"title": "Commander", "project": "Golang"}

    fmt.Printf("%v\n", name(person))
    fmt.Printf("%v", name(job)) //this gives a compile error

    fmt.Printf("%v\n", keys(person))
    fmt.Printf("%v\n", keys(job))
  }

Here both Person and Job have map[string]string as the underlying type. If you try to pass an instance of type Job to the name function, it gives a compile error because the function expects an argument of type Person. However, you will note that we can pass instances of both Person and Job to the keys function, which expects an argument of the unnamed type map[string]string.

If you still find assignability of types confusing, I'd recommend reading the explanations by Rob Pike in the following discussion.

Type Embedding

Previously, I mentioned that when you declare a new type, it will not inherit the method set of the existing type. However, there's a way to embed the method set of an existing type in a new type. This is possible by using anonymous fields in a struct type. When you define an anonymous field inside a struct, all its fields and methods are promoted to the enclosing struct type.

  package main

  type User struct {
    Id   int
    Name string
  }

  type Employee struct {
    User       //anonymous field
    Title      string
    Department string
  }

  func (u *User) SetName(name string) {
    u.Name = name
  }

  func main(){
    employee := new(Employee)
    employee.SetName("Jack")
  }

Here the fields and methods of the User type get promoted to Employee, enabling us to call the SetName method on an instance of the Employee type.

Type Conversions

Basically, you can convert between a named type and its underlying type. For example:

  type Mystring string

  var my_str Mystring = Mystring("awesome")
  var str string = string(my_str)

There are a few rules to keep in mind when it comes to type conversions. Apart from conversions involving string types, conversions between a named type and its underlying type only change the type, not the representation of the value.

You can convert a string to a slice of integers or bytes and vice versa.

  []byte("hellø")

  string([]byte{'h', 'e', 'l', 'l', '\xc3', '\xb8'})

More robust and complex run-time type manipulations are possible in Golang using interfaces and the reflection package. We'll see more about them in a future post.

]]>
Different flavors of JavaScript http://laktek.com/2012/01/19/different-flavors-of-javascript http://laktek.com/2012/01/19/different-flavors-of-javascript/#comments Wed, 18 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/19/different-flavors-of-javascript If you are programming with JavaScript, knowing about ES3, ES5 & Harmony specifications and their usage will be useful. Here's a plain & simple explanation of them for your easy understanding.

ECMAScript

If we look into the history of JavaScript, it originated from a side project of Brendan Eich called "Mocha". In 1995, it shipped with the Netscape browser as "LiveScript" and was soon renamed "JavaScript" (mainly due to the influence of Sun Microsystems). Due to the quick popularity of JavaScript, Microsoft also decided to ship it with Internet Explorer in 1996. Microsoft's implementation had slight differences from the original, thus they aptly named it "JScript".

As browser wars between Netscape and Microsoft fired up, Netscape soon pushed JavaScript to Ecma International for standardization. Ecma accepted the proposal and began the standardization under the ECMA-262 standard. As a compromise for all organizations involved in the standardization process, ECMA-262 baptized this scripting language as "ECMAScript".

Even though we still call it JavaScript, the technically correct name is ECMAScript.

ES3

Over the years, Ecma has released different editions of the ECMAScript standard. For ease of use we call these standards "ESx", where x refers to the edition. So the 3rd edition of ECMAScript is known as ES3. ES3 can be considered the most widely adopted edition of ECMAScript.

The most outdated browser in the mainstream (aka the Disease), Internet Explorer 6, is compliant with ES3. Sadly, the other common IE versions (7 & 8) are also only compatible with ES3. Early versions of most other browsers also supported ES3. This means all JavaScript features you commonly use are part of ES3. Most JavaScript libraries, frameworks, tutorials, best practices and books written in the past are based on these features standardized in ES3.

Source-to-source compilers such as CoffeeScript, which aim to run everywhere, compile their output to be compatible with ES3.

If you are interested in reading the full ECMAScript 3rd edition specification, you can download it from here.

ES5

After years of splits and conflicts of interest, Ecma's Technical Committee came to an agreement in 2008 to follow two paths for the future development of ECMAScript. One was an immediate plan to overcome the issues in the ES3 specification (then called ES3.1). The other was a long-term vision to evolve the language for modern requirements. To support the above plans, they also decided to kill the ES4 specification, which was under development.

The ES3.1 edition was finally released as ES5 in 2009. Some of the notable features in this edition were native JSON support, better object creation and control options, utility Array functions and the introduction of strict mode to discourage the use of unsafe and poorly implemented language features. You can read a detailed introduction to ES5 features on the Opera blog.

Full support for ES5 in major browsers was introduced from the following versions - Firefox 4, Chrome 7, Internet Explorer 9 and Opera 11.60. Safari 5.1 (and Mobile Safari in iOS 5) supports all ES5 features except Function.bind. Also, IE9 doesn't support the strict mode option. Juriy Zaytsev provides a comprehensive compatibility table of ES5 features, which I recommend you bookmark.

So is it safe to use ES5 features in our JavaScript code? The answer largely depends on your user base. If the majority of your users come from Internet Explorer 6, 7 & 8, code with ES5 features will break for them. One way to solve this problem is to use ES5 shims for unsupported browsers. You may decide which shims to include depending on the features you use in your code. Also, if your code already depends on a utility library such as Underscore.js, which provides features similar to ES5's, you may continue to use it. Most utility libraries will use the native implementation if available, before falling back to their own implementation.

If you are writing server-side JavaScript on Node.js, you can freely use ES5 features. Node.js is based on the V8 JavaScript engine, which is fully compatible with ES5. Another thing to consider is whether you should write your server-side JavaScript using CoffeeScript. If you do so, you are limiting your ability to use native ES5 features. As I mentioned earlier, CoffeeScript compiles only to ES3-compatible JavaScript and has custom implementations for ES5 features such as bind. However, there is still an open issue with discussions suggesting CoffeeScript may add the option to compile to ES5-compatible code in the future.

For the full reference of ES5, I recommend using the annotated HTML version done by Michael Smith - es5.github.com

ES.Next (Harmony)

The long-term plan for ECMAScript from the 2008 meeting was code-named Harmony. The committee is still accepting proposals for this edition. Most probably this will become ES6, but given the past track record of ECMA-262, ES.Next is a more suitable name until a release is made.

The currently planned features for Harmony sound promising. Brendan Eich has shared some ideas for Harmony which seem to make the language more concise and powerful. Also, his presentation on Proxy Objects in Harmony sounds awesome.

The SpiderMonkey and V8 JavaScript engines have already started implementing some of the Harmony-related features, such as Proxies and WeakMaps. It is still premature to use these features on the client side (in the Chromium browser you need to explicitly enable Harmony features via a special flag). Node.js 0.7 will ship with V8 version 3.8, giving you the opportunity to taste some of the Harmony features on the server side.

]]>
Learning Go - Constants & Iota http://laktek.com/2012/01/12/learning-go-constants-iota http://laktek.com/2012/01/12/learning-go-constants-iota/#comments Wed, 11 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/12/learning-go-constants-iota When you learn Golang, your first encounter with constant declarations can be a little confusing.

type ByteSize float64
const (
    _ = iota  // ignore first value by assigning to blank identifier
    KB ByteSize = 1<<(10*iota)
    MB
    GB
    TB
    PB
    EB
    ZB
    YB
)

Why does only the first constant (KB) have a value assigned? What does iota mean? These are some of the questions that could pop into your mind when you go through the above code sample for the first time. As I mentioned in the previous post, the best way to find answers to such questions is to refer to the Go spec.

Constant Expressions

In Go, constants can be numeric, string or boolean values. Basically, constant values can be represented by a literal (eg. "foo", 3.0, true) or by an expression (eg. 24 * 60 * 60). Since constants are evaluated at compile time, constant expressions should be composed only of constant operands.

If a constant value is a literal or is evaluated from an expression containing only untyped constant operands, it is an untyped constant. If you want, you can explicitly specify a type for a constant at the time of declaration.

const typed_size int64 = 1024
const untyped_size = 1024

var f float64 = typed_size // will give you a compile error
var f float64 = untyped_size // valid assignment

As shown in the above example, if you try to assign a typed constant to a variable of a different type, it will give a compilation error. On the other hand, an untyped constant is implicitly converted to the type of the variable at assignment.

Declaring Constants

You can declare multiple constants in a single line as a comma-separated list. However, you should always have the same number of identifiers and expressions. This means,

 const a, b, c = 1 //invalid assignment

is invalid.

You should write it in this way:

 const a, b, c = 1, 1, 1 //valid assignment

If you feel this is too repetitive, you can try the parenthesized form of constant declaration. The most interesting property of the parenthesized form is that you can omit the expression for an identifier. If you do so, the previous expression is repeated. So the above example can be re-written in parenthesized form like this:

 const(a = 1
       b
       c
      )

Here constants b and c will also take the value 1.

const (
    Yes = 1
    True
    No = Yes >> 1
    False
)

In the above example, the constants Yes and True get the value 1, while No and False get 0. Also, note that we can use a previously defined constant in an expression to declare another constant.

What is Iota?

A practical usage of constants is to represent enumerated values. For example, we can define days of the week like this:

const (
  Sunday = 0 
  Monday = 1
  Tuesday = 2
  Wednesday = 3
  Thursday = 4
  Friday = 5
  Saturday = 6
)

Iota is a handy shorthand available in Go that makes defining enumerated constants easy. Using iota as an expression, the above example can be re-written in Go as follows:

const (
  Sunday = iota
  Monday
  Tuesday
  Wednesday
  Thursday
  Friday
  Saturday
)

The iota value is reset to 0 at every constant declaration (a statement starting with const), and within a constant declaration it is incremented after each line (ConstSpec). If you use iota in different expressions on the same line, they will all get the same iota value.

const(
 No, False, Off = iota, iota, iota // assigns 0
 Yes, True, On                    // assigns 1
)

Finally, here's a little tricky one. What would be the value of c?

const (
    a = iota // a == 0
    b = 5    // b == 5
    c = iota // c == ?
)

In the above example, c will take the value 2. That's because the value of iota is incremented at each line within a declaration. Even though the 2nd line doesn't use the iota value, it still gets incremented.

]]>
Learning Go http://laktek.com/2012/01/05/learning-go http://laktek.com/2012/01/05/learning-go/#comments Wed, 04 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/05/learning-go This year I'm going to try a new programming language - Go. I had this notion that compiled, type-based languages are overly complex and reduce developer efficiency. However, after doing some reading about Go, it appeared to take a different path from the rest and felt like something worth trying.

Getting acquainted with a programming language is a journey. The first few steps you take with it will define your perception of it. These first few steps went well for me with Go and it felt like a good fit for my repertoire. I thought of sharing my learning experience, hoping it will help others who want to learn Go.

Getting Started

Installation of Go involves some steps, but it's well documented. If you followed the steps correctly, you should have the Go compiler installed without any issue.

As recommended in the official site, I started learning the basics with the in-browser tour of Go. It covers the essentials of Go with good examples. Once you are done with the tour, I recommend reading the Go Tutorial and Effective Go, which have good coverage of how to write idiomatic Go code.

Make sure you configure your text editor to support Go syntax highlighting before moving on to coding. Here's a comprehensive list with details on how to configure Go for various editors. Apart from that, Go comes with a command-line utility called gofmt, which standardizes the format of your Go files by doing things such as removing unnecessary brackets and applying proper indentation and spacing. As recommended in this thread, I added the following lines to my .vimrc to invoke gofmt every time a file is written in Vim, so I always have well-formatted Go code.

  set rtp+=$GOROOT/misc/vim
  autocmd BufWritePost *.go :silent Fmt

Reference

Go being a new language (and terribly named), Google search isn't much of a help when coding in it. I recommend keeping the language spec at your disposal and always referring to it when in doubt. I hope more resources will become available as the language gains popularity.

One of the best features of Go is its rich standard library (or default packages). You will find default packages for most common tasks such as handling HTTP requests, image manipulation, cryptography and encoding/decoding JSON. Most of these packages contain good documentation, and the package documentation also links to the source files of the package, making it easier to read the code.

You will find a lot of third-party packages written in Go on the package dashboard. I found that reading third-party packages' source code is a good way to discover the best practices and styles involved with Go. Let me point you to two such good packages to read - one is Google's Snappy compression format implemented in Go, and the other is a wrapper for GitHub's issues API written by Justin Nilly.

Organizing Code

In Go, you have to organize your code using packages. Every file in a Go program should belong to a package. During compilation, the compiler will combine all files with the same package clause. There's no requirement to keep the files of the same package in the same directory; you have the flexibility to physically organize your files in a way you prefer. However, a file can have only one package clause.

Inside a package you can declare variables, constants, types and functions. All these top-level declarations will be scoped to the package. If you want to expose an identifier to the outside, you must start its name with a capital letter. In Go this is known as exporting, and there's no concept of access modifiers, which is one less thing to worry about when coding in Go.

To use a package in another place you must import it. Imports are scoped to the file, so if you use a package in different files, each of them must import it individually, even if they all belong to the same package.

Here's an example of a typical Go file. Note the package clause, import and exporting of function identifier.

  package foo

  import "bar"

  var my_var = "baz"

  func Baz() string {
    return my_var 
  }

Identifiers & Declarations

Identifiers can only contain Unicode letters, digits and underscores (_). This means identifiers such as awesome? are invalid, but you can have identifiers such as µ (using Unicode letters).
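A couple of quick examples (just a sketch):

  var µ = 3.14       // valid: Unicode letters are allowed
  var max_count = 10 // valid: digits and underscores are allowed
  // var awesome? = true  // invalid: '?' is not allowed in identifiers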

In Go, you can declare multiple identifiers using expressions. Since functions can also return multiple values, we can have expressive statements like this:

  var x, y = getCoords()

Imagine you are only concerned about the y value in the above context and don't actually need x. You can assign the blank identifier (_) to the values you are not interested in. So the above example can be modified to:

  var _, y = getCoords()

You can also group declarations with brackets.

  var (
        name string 
        age = 20
  )

Apart from using var to declare variables, there's also a short-hand form using :=. Here you can omit the type and let it be deduced from the expression. The short-hand form can only be used inside functions (and also in some cases as initializers for if, for and switch statements). You can also declare multiple variables with the short-hand expression. Another interesting thing here is that you are allowed to redeclare variables in multi-variable short-hand declarations if there's at least one new variable in the declaration. Here's an example:

  func name() string {
    first_name := "John" 
    first_name, last_name := "Peter", "Pan"

    return first_name + " " + last_name 

  }

Functions & Methods

As I mentioned earlier, functions in Go can have multiple return values. In idiomatic Go, the main purpose of multiple returns is error handling. You define one return value as the result and another as the error. The caller of the function should check the value of the error and handle it if there's an error. Let me explain this with an example:

  func getFile() (file *File, error os.Error) {
    ...
  } 

  func process() string {
    file, error := getFile()

    if error != nil {
      print("Error occurred in retrieving the file.")
      return ""
    }

    // do something with the file

    return result
  }

Every function in Go must always end with a return statement or a panic statement, or have no return values. You cannot simply return inside a control statement such as if or switch (Update: As Jeff Wendling mentioned in the comments, you can do this if you also have a return statement at the end). Some may call this a language feature that helps make the code more obvious, but the Go core team accepts this as a known issue: http://code.google.com/p/go/issues/detail?id=65
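For example, with the compilers of that time, a sketch like the following would not compile without the trailing return, even though both branches already return:

  func abs(x int) int {
    if x < 0 {
      return -x
    } else {
      return x
    }
    return 0 // required by the compiler, even though it is never reached
  }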

Go also has methods. But unlike in other languages where methods are bound to objects, in Go methods are bound to a base type (actually, Go doesn't have a concept of objects). Basically, a method is the same as a function, with an explicit receiver declared before the function name. Note that you can define methods only for types declared in the same package.

Here's an example on how methods can be defined and called.

  type mystring string

  func (s mystring) capitalize() string {
    ...
  }

  func main() {
    var str mystring = "paul"
    print(str.capitalize())
  }

String & Character Literals

Similar to C, Go has both character literals and string literals. Character literals should be written inside single quotes ('a' or '\u12e4').

String literals come in two forms. One is known as the raw form, where you write the string inside back quotes (`abc`), and the other is known as the interpreted form, where the string is written inside double quotes ("abc"). Strings in raw form can span multiple lines, while an interpreted string must be on a single line. In the raw form, if you write \n it is output as it is, whereas in the interpreted form it is treated as in the context of character literals (escaped properly, producing a new line).
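Here's a small sketch of the difference:

  raw := `hello\ngo`         // raw form: \n stays as a backslash followed by 'n'
  interpreted := "hello\ngo" // interpreted form: \n becomes an actual newline

  message := `raw strings can also
span multiple lines`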

Don't confuse the interpreted form with the string interpolation you find in other languages. The closest thing to string interpolation in Go is the fmt.Sprintf function.

  fmt.Sprintf("The value is %v", 15) //output: The value is 15

Control Flow

There are two ways to do control flow in Go - using if and switch statements. You can provide an initializer statement before the conditional expression. These expressions need not be wrapped in parentheses.

Branches in if statements should always be written inside blocks (enclosed by {}). Go doesn't support the ternary operator (?:) or single-line if statements as in other languages. When writing an else statement, it should always be on the same line as the closing curly bracket of the previous if branch.

  if a := getScore(); a > 500 && game_over == true {
    print("You have a high score!")
  } else {
    print("You score is low!")
  }

Cases in switch statements have an implicit break in Go. There's no default cascading through case statements (Update: As Jeff Wendling mentioned in the comments, you can use the fallthrough keyword to cascade through cases; see the small sketch after the example below). If you want multiple cases to provide the same behavior, you may define them in a comma-separated list. Also, in switch statements you can omit the expression if you want; such cases are evaluated against true.

  switch color {
    case "red": print("danger")
    case "green", "yellow", "blue": print("normal")
  }
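And here's a small, hypothetical sketch of the fallthrough keyword in action:

  switch score {
  case 100:
    print("with distinction, ")
    fallthrough
  case 50:
    print("passed") // a score of 100 prints "with distinction, passed"
  default:
    print("failed")
  }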

Compiling

Go's compiler follows a no-bullshit approach and is quite adamant about the structure of your code. There's no such thing as a warning in the Go compiler. If it finds an issue, it simply won't compile. Importing packages and not using them, or declaring variables and not using them, will stop the compiler from compiling your code. At the beginning you will feel such nitpicking is a hindrance, but as you get used to it you will feel you are writing cleaner code.
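For instance, a sketch like this is rejected outright:

  package main

  import "fmt" // compile error: "fmt" imported and not used

  func main() {
    count := 10 // compile error: count declared and not used
  }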

In this post, I touched only the surface of Go programming. In future posts, I'm planning to dig deeper covering topics such as slices, interfaces, channels, goroutines and testing.

]]>
Reviewing My 2011 http://laktek.com/2011/12/30/reviewing-my-2011 http://laktek.com/2011/12/30/reviewing-my-2011/#comments Thu, 29 Dec 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/12/30/reviewing-my-2011 It's almost the end of 2011. Now is a good time to review how things went in the last 12 months and make some resolutions for the next 12 months.

From the start, I expected this to be a challenging year. This year, I turned 25 and was working full-time (no more formal education). So it was like moving on to a new chapter in life. From the beginning, I had a few personal goals for the year. Here's a summary of how I progressed with them during the year.

Blogging

Blog Traffic Graph

I've been running this blog since 2006, but it was only this year that it actually broke out of its rut. Traffic-wise this is still nothing spectacular, but I like how it has evolved with my experience.

This year, I didn't post frequently, but when I did I tried to come up with something interesting to read. I enjoy blogging because it is a great way to learn. From gathering points for a post, to presenting it to suit the audience, to the discussions that take place afterwards, it all helps to expand my knowledge of the topic.

Also, I redesigned the blog and switched to a new platform during this year.

Fitness

Fitness Graph

Every time I started doing workouts in the past, it didn't last for more than a couple of days. Either due to exhaustion or lack of motivation I simply didn't continue. But last June I decided to give it another try; this time with a much simpler and more relaxed schedule (based on the Couch to 5K plan). So far I've been able to stick to the schedule and ramp it up gradually (the dip in October was due to bad weather). I should mention RunKeeper, which provided great assistance by tracking my progress and motivating me to stick to the schedule.

I was never an athletic person; that's why I'm so delighted with this progress!

Spending

Expense Graph

I believe the best way to live a happy life is not earning more, but spending less. Spending less doesn't mean living a meager life. It's about knowing when and on what to spend. When we launched Expense Tracking for CurdBee last July, I started using it to track and analyze my own expenses.

Soon I was able to apply some tweaks to my life which not only helped me reduce expenses but also have some long-term benefits. I made a major saving by relying more on public transport and walking, instead of spending on fuel (walking is a more efficient and sane option in Colombo, where roads are always congested). Also, I tried sticking to 3 home-cooked meals, instead of eating out or having junk food as fillers. When it came to buying stuff, I avoided making impulsive decisions and focused more on quality than price. Paying more for quality ensured fewer troubles and longer-term use.

What are my goals for 2012? I shall build on the starts I made in 2011. I want to keep raising my bar.

]]>
Sugarless - A Functional & Context Oriented way to write JavaScript http://laktek.com/2011/12/21/introducing-sugarless http://laktek.com/2011/12/21/introducing-sugarless/#comments Tue, 20 Dec 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/12/21/introducing-sugarless Fundamentally, JavaScript is a functional programming language. It's built with the concepts of higher-order functions, closures and lazy evaluation. It also has objects with prototypal inheritance. Unlike in Object Oriented languages, JavaScript's objects don't have methods. Instead, JavaScript executes functions in the context of objects. When you reference this inside a function, it returns a reference to its current context.

Try running following expressions in your browser's console.

  var testFunc = function(){ return this; }
  testFunc()

  //returns reference to window object

  var obj = new Object();
  obj.testFunc = testFunc;
  obj.testFunc()

  // returns reference to obj

As you can see, the same function referred to two different this values on the two occasions. Moreover, we can explicitly set the this value (or the context) when we invoke a function.

  var testFunc = function(){ return this; }
  var obj = new Object();

  testFunc.call(obj);

  // returns reference to obj

I think most of us find JavaScript confusing because we tend to think from an Object Oriented mindset. It's more pragmatic to think of JavaScript in terms of functions and contexts. However, the lack of expressivity in the language itself limits this line of thought.

What is Sugarless?

Sugarless is a more expressive way to write functional and context-oriented programs in JavaScript.

Imagine a case where you have an input which has to undergo certain operations to produce a meaningful output. We define each operation as a function. The common way to do this in JavaScript would be:

  var output = truncate(trim(sanitize(input)), 200)

At a glance, it's not very easy to comprehend. It gets further complicated if we try to add more functions or remove a certain function.

If we think in terms of Object Oriented concepts we can try to make this more readable with a fluent interface (ie. method chaining).

  var output = input.sanitize()
                    .trim()
                    .truncate(200);

In order to do this in JavaScript, we must make sure the functions are available in the input object's prototype chain, the object itself is mutable, and all functions apart from the last one return the input object. Though this can be done, it doesn't feel very natural or flexible.

Using Sugarless, this is how we can write it in a more readable and flexible manner:

  var output = Sugarless(input)(
     sanitize          
   , trim             
   , truncate, "", 200   
  );

Sugarless will invoke each function with the this value set to input (the context) and the first argument set to the return value of the previous function. The only thing we need to change in the functions is to use the this value as the input when no argument is passed (you can find the full example here).

What happens under the hood?

It's not hard to understand the logic behind Sugarless. When you call Sugarless with an object, you are creating a new context. Then you can define functions (and arguments) to run under this context.

  Sugarless(context_object)(
    function(){ }, arg1,.., argN 
  , function(){ }
  ...
  ...
  , function(){}
  );

In pure JavaScript terms, Sugarless(obj) is a function which returns a function. The returned function accepts any number of arguments (the first argument should always be a function). Sugarless will invoke the functions passed with the this value set to the context object (obj). Also, if you define any non-function values after a function, they are passed as arguments to that function. However, if the previous function in the chain returns a value, it will override the first argument of the current function.

Sugarless' Powers

Along with the context and function chaining Sugarless gives you a bunch of nifty features to organize your code better. I'll highlight some of the most interesting ones here. To learn about all features available please refer to the README.

Sugarless provides a mechanism to do deferred execution. By default, all functions in the chain will execute sequentially. However, if you call sugarless.next() inside one of the functions, it will halt the chain and return the next function to be executed. If you are making an asynchronous call, you can pass the next function as a callback and resume the chain when you receive the results.

    Sugarless(obj)(
       function() { setTimeout(Sugarless(this).next(), 60) }
     , function() { console.log("second function") }
     , function() { console.log("third function") }
    );

You can provide optional before and after callbacks to a context, which will be invoked before and after the chain respectively. This can be really useful if you want to create wrappers around Sugarless.

    Sugarless('{"name": "John"}', {
        before: function(){ return JSON.parse(this); }
      , after:  function(obj){ return JSON.stringify(obj); }
    })(
         function(obj) { obj.profession = "Programmer"; return obj; }
       , function(obj) { obj.favorite_food = "Pizza"; return obj; }
    );

Sugarless makes sure not to invoke any functions when the context is null or undefined. Instead it simply returns null as the result. This behavior is somewhat similar to the Null Object Pattern you find in Object Oriented programming. Further, you have the option to specify a fallback context in the event of a null context.

Here's an example of providing a polyfill for navigator.geolocation for browsers that don't implement it (you can see the full example here).

  var customGeolocation = {};

  Sugarless( navigator.geolocation, { 
      fallback: customGeolocation
  })(
    function(){}
  , function(){}
  );

Bottom-up Programming

Using Sugarless will encourage you to build your solution by separating behavior into focused and untangled functions. For example, here's how we have defined the initial contexts in the Todo List example:

      $("ul#pending_todos")(
        Store.fetch
      , List.add
      );

      $("form#new_entry")(
        onSubmit, [ Form.captureInput, Task.save, Form.reset ] 
      );

We instruct it to fetch tasks from the datastore and populate the pending todos list. For the new entry form, we have defined certain behaviors to invoke when a user hits submit. Note how each function focuses only on doing one small task, and how the context holds them together to perform the bigger task. This approach makes it easier to understand the behaviors and even allows you to refactor a particular behavior without affecting others.

Since these functions take the context into consideration, reuse also becomes very easy. For example, later we also wanted to show a list of completed todos apart from the pending todos. It behaves very similarly to the pending todos list, other than having a strikethrough in its text.

    $("#completed_todos")(
        Store.fetch
      , List.add
    );

We reused the same Store.fetch and List.add functions to populate the completed todos list as well. From the context, the functions know which todos to fetch and which list to add them to.

For the brevity of this post, I'm not going to explain the full implementation here. I suggest you check the source, which is pretty much self-explanatory.

Give it a try...

This is just the initial public release of Sugarless. There can be bugs, ambiguous parts and a ton of possible improvements. So I appreciate all your feedback & contributions to make it better. The best way to start is to write some code with it.

You can install it from NPM by running npm install sugarless or you can grab the source from GitHub. If you want to learn more, check the examples and also read the spec.

Personally, Sugarless gave me a better grip on JavaScript and I hope it will do the same for you. However, I don't expect it to be everyone's cup of tea either. The choice of tools largely depends on personal taste and experience :)

Last but not least, I should thank Nuwan Sameera for being my sidekick in this project & rest of the Vesess Team for their invaluable feedback and support.

]]>
After Graduation http://laktek.com/2011/12/15/after-graduation http://laktek.com/2011/12/15/after-graduation/#comments Wed, 14 Dec 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/12/15/after-graduation One year after graduation, I hear many of my friends already reminiscing about their undergrad days and the freedom we had back then to live our lives the way we wanted. Yes, it was a time when we never feared to pit ourselves against extreme challenges, found love in hopeless places and fought to change the world!

But do we really lose this freedom to live our lives the way we want as we graduate?

As I see it, the following 3 decisions you take after graduation will define everything about you and your future.

  1. What do you do for a living?
  2. Where do you live?
  3. With whom do you live?

Not everyone will have the same options and conditions when it comes to making these decisions. So you cannot simply compare how smart or correct one person's decisions are against another's.

The most important thing is to be conscious of yourself when making these decisions. Never just go with the flow or let others make these decisions for you. Then, at some point in life when you experience the consequences (both bitter and sweet), you will know they are the results of your own decision making.

Keep control of your life and live the way you want!

]]>
Basic Patterns for Everyday Programming http://laktek.com/2011/11/23/basic-patterns-for-everyday-programming http://laktek.com/2011/11/23/basic-patterns-for-everyday-programming/#comments Tue, 22 Nov 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/11/23/basic-patterns-for-everyday-programming For most of you the patterns mentioned below should be nothing new. They are very basic stuff we slap into our code every day, and at times they feel more like code smells than smart patterns. However, I've been doing some code reviewing lately and came across a lot of code that lacks even these basic traits. So I thought of writing them down as a help for novice developers who want to get a better grasp of them.

These patterns are commonly applicable in most general purpose programming languages, with slight syntactical changes. I use Ruby and JavaScript for the examples in this post.

Verify object's availability before calling its methods or properties

In an ideal world we expect every call for an object to return that object, but in the real world either the object or null is returned. If we try to invoke a method without knowing that the object could be null, exceptions will be raised or, at worst, unexpected results returned.

The simple way around this is to verify the object's availability before calling its methods or properties. We connect the object and its method (or property) call with a logical AND operator (&&), so the method is only invoked if the object is truthy (not null). This technique is commonly known as 'andand'.

Here's a real-life example in JavaScript, where we use the native JSON object to parse a string:

  var parsed_content = window.JSON && window.JSON.parse("{}");

If the native JSON object is not present in the window context, parsed_content will be set to undefined.

Some languages have built-in shorthand methods for this pattern. If you are using the Ruby on Rails (2.3+) framework, Object.try serves the same purpose. Which means:

  @person && @person.name

can be written as:

  @person.try(:name)

Set a default value with assignments

When we want to assign a value to a variable, the value returned could actually be null or undefined. In such instances it's better to assign a default value. This minimizes the surprises later in the code and simplifies the conditional logic involving that variable.

To assign a default value in a single assignment statement, we can use logical OR operator (||), which assigns the latter value if former is falsy.

Here's a simple example in Ruby:

  @role = @person.role || "guest"

Gotcha: Be aware of contexts where your variable could take a boolean value. The default value will be returned even when the expected value is legitimately false.

Checking whether a variable equals to any of the given values

Imagine an instance where you have to perform an action if the current_day is either Monday, Wednesday or Friday. How would you check that condition?

I've seen many write this as:

  if(current_day == "Monday" || current_day == "Wednesday" || current_day == "Friday") 
    # perform action
  end

It does the job, but as you can see, it's verbose. What happens if we have to mix another condition with this in the future (eg. also check whether the calendar date is above 20)?

A better way to do this is to collect the given values into an array and check against it. Here's the modified code:

  if(["Monday", "Wednesday", "Friday"].include?(current_day)) 
    # perform action
  end

Same example can be written in JavaScript like this:

  if(["Monday", "Wednesday", "Friday"].indexOf(current_day) >= 0){
    // perform action
  }

Extract complex or repeated logic into functions

When you have long, complex logic in condition or assignment statements, extract it into functions. It improves code clarity and also makes refactoring a lot easier.

Here's a slightly modified version of previous example:

  if(["Monday", "Wednesday", "Friday"].include?(current_day) && (current_date > 20)) 
    # perform action
  end

We can extract this logic into a function and call it like this:

  def discount_day?
    ["Monday", "Wednesday", "Friday"].include?(current_day) && (current_date > 20)
  end

  ...

  if(discount_day?) 
    # perform action
  end

This refactoring allows others to read the code in context to the domain, without having to comprehend the internal logic.

Doing similar refactoring should be possible in every language.

Memoize the results of repeated function calls

Another advantage of extracting logic into functions is that you can memoize the result of the calculation, if it's going to be called repeatedly.

Here's how simple memoization works in Ruby.

  def discount_day?
    @discount_day ||= (["Monday", "Wednesday", "Friday"].include?(current_day) && (current_date > 20))
  end

Let's try to understand what happens here. On the first call, the @discount_day instance variable is nil; hence the right-hand expression is evaluated and its result is assigned to @discount_day. But on the next call, since @discount_day already holds a value, its value is returned without evaluating the expression again.

Let's see how to do similar in JavaScript:

  var is_discount_day; // cached result, kept separate from the function name
  function discount_day(){
    if(typeof is_discount_day === "undefined"){
      is_discount_day = (["Monday", "Wednesday", "Friday"].indexOf(current_day) >= 0 && (current_date > 20))
    }
    return is_discount_day
  }

Each language may have its own way of doing memoization; refer to yours and take advantage of it.

]]>
Why and How I Revamped My Blog http://laktek.com/2011/11/17/why-and-how-i-revamped-my-blog http://laktek.com/2011/11/17/why-and-how-i-revamped-my-blog/#comments Wed, 16 Nov 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/11/17/why-and-how-i-revamped-my-blog Update: I've migrated this blog from Jekyll to Punch. Here's the detailed blog post.

If you are a regular visitor of my blog you will notice it has undergone a significant revamp (if you are new here, it used to look similar to any other Wordpress blog). Actually, I've been working on this revamp for quite some time and must say I'm really satisfied with the final outcome.

There were several main intentions behind this revamp. The first was to make it pleasurable to read on mobile. I tend to read a lot on mobile, but 99% of the sites I visit render as crap there. Thanks to the Readability bookmarklet or Instapaper I get to read them on mobile, but with limitations. With these services you lose the engagement with the original site. Also, mobile readers tend to drop important bits of information such as code blocks, images and tables in the process of scraping. I wanted to make sure at least my blog can be read on mobile without any additional steps or tools.

When I originally started this blog 5 years ago, it was powered by Wordpress and hosted on Dreamhost. This was pretty much the preferred setup at that time, but over the years it became too troublesome to maintain. Wordpress became a constant target of attackers and there were important security releases almost every day. If you miss one, you are busted! Once I nearly got removed from the Google index thanks to such an attack. In Wordpress, even a simple template customization means you are diving into a pool of spaghetti soup. Apart from that, shared hosts like Dreamhost have become slow, inconvenient and simply worthless for what you pay. This setup was driving me crazy and I wanted to get rid of it.

Mobile First Design

Having worked a lot on backend and client-side scripting, this was my first attempt at designing an entire site with HTML5 and CSS3. Actually, thanks to modern browser support, frontend design has become more interesting and fun. I managed to mock this layout entirely in HTML without even touching an image editor. The beauty of HTML mocks is that they are interactive and you always know how the design will render in the browser.

Since I wanted to make this optimized for mobile, I started with a mobile-first design. The process was immensely simplified thanks to the 'HTML5 Boilerplate' and '320 and Up' responsive stylesheets. Mobile-first design expects you to identify the most important elements of the UI and then extend them for larger screens.

I used Voltaire as the typeface for the main title, and the hand-drawn Shadows Into Light as the typeface for the subtitle. Both of these fonts are freely available and were embedded using Google Fonts. The header background was spiced up with a CSS3 gradient (generated from here).

All icons used in this design are in SVG format. The advantage of using SVGs is that they can be scaled according to the viewport. I found this really cool, especially when dealing with multiple resolutions. To get SVGs to work across all browsers I load them via RaphaelJS. All icons used here are from The Noun Project collection and Raphael icons.

All posts in this blog follow the semantic guidelines provided by Readability, based on HTML5 elements and the hNews microformat.

Homepage spreads to 3-columns in large screens

Powered by Jekyll

I prefer to use Vim for all my writing (from code to emails). I love how it lets me keep my focus without breaking the flow. Also, I feel more comfortable using Markdown for formatting than a WYSIWYG editor. Earlier, I used to draft my posts this way and paste them into Wordpress; but it would have been awesome if I could publish them directly from the command line. Jekyll, a static site generator by Tom Preston-Werner, does exactly that.

Since Jekyll is based on Ruby and uses the Liquid templating system, customizing and extending it is also very easy. However, I didn't need much customization to convert my existing Wordpress blog to Jekyll.

Hosted in Amazon S3

As I mentioned before, I was fed up with dealing with shared hosts, yet I couldn't make up my mind to go for a VPS just to host a blog. With Jekyll, there's no server-side logic involved. The site is generated locally and what I needed was a place to host the HTML files and other static resources. Amazon S3 fits this purpose beautifully. Converting any S3 bucket to serve a website is as easy as ticking a box in the AWS console (you can read more details on the AWS blog).

I had to map the domain laktek.com to the S3 endpoint with a CNAME record. One downside of this is that you lose the ability to maintain email addresses on this domain (a better way would have been to use a subdomain for the blog, but I had all my link juice on this domain).

Currently, I use the Jekyll-S3 gem to push the generated site to the S3 bucket. It still doesn't have a mechanism to push only the updated files, but I'm hoping to have a better syncing strategy in the future.

Handling Dynamic Pages

In Wordpress there are dynamic index pages by date (/2011/10) or tag (/tag/code). To have this behavior in Jekyll, it seemed I needed to generate static pages for each date or tag. I didn't like this idea, as it could make the building and deploying process even slower with all the additional pages. Instead, I was able to come up with a small hack to handle the dynamic pages.

In Amazon S3, you can specify an error page to display when it cannot find the given resource. I'm using the archive page as the error endpoint, which renders a list of all the posts. Then with the help of a simple JS script, I filter the page to match the requested URL.

To get a better idea of what's happening here, try visiting following pages - http://laktek.com/2009 or http://laktek.com/tag/code.

Managing it from the Cloud

Most of the time, I do work and publish my blog posts from my local machine. But that doesn't mean I can use only the local machine to build my blog. I have a setup, which enables me to build it from any machine.

I keep the templates and generator in a git repository and push the changes to GitHub (yes, you can reuse the source, but don't just rip the site entirely). All the posts and sensitive configuration details are stored in Dropbox. If I can access these resources, I can build my blog from any machine. In the future, I'm planning to offload the entire build process to the cloud by hiring Amazon EC2 Spot instances.

Also, I should mention Nocs, a free iPhone app which I use to edit the posts in Dropbox with Markdown syntax. It's a really convenient way to do quick edits to posts and jot down ideas for future posts.

]]>
Thank You, Steve! http://laktek.com/2011/10/06/thank-you-steve http://laktek.com/2011/10/06/thank-you-steve/#comments Wed, 05 Oct 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/10/06/thank-you-steve It's still early morning here in Sri Lanka; I was just starting the day passively glancing over my Twitter feed. In a stream filled with rants over the iPhone 4S' form factor and predictions on how Siri is changing the world, seeing this tweet just shocked me. It feels like the passing of someone close to my life. Yes, Steve was terminally ill, but who'd have thought it would come to him so soon?

I admire Steve Jobs not because of Apple, but for the constant inspiration he gave throughout his life. As a kid, seeing what Bill Gates had achieved, I believed you needed to be an extraordinaire gifted with talent and luck to become successful and change the world.

However, listening to Steve's Stanford graduation speech changed my perspective. I started to believe anyone can change their destiny, if they have true passion and perseverance.

Steve's life was not a bed of roses. As a child he was adopted, he had to drop out of university, he was kicked out of his own company and he lived the best years of his life battling pancreatic cancer. Yet he managed to "put a ding in the universe".

Thank you Steve for all the inspiration! May your soul rest in peace!

P.S. - Please Consider donating to Pancreatic Cancer Action Network, to fight against the disease that took Steve's life.

]]>
Creating asynchronous web services with Goliath http://laktek.com/2011/08/24/creating-asynchronous-web-services-with-goliath http://laktek.com/2011/08/24/creating-asynchronous-web-services-with-goliath/#comments Tue, 23 Aug 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/08/24/creating-asynchronous-web-services-with-goliath Recently, I've been working on improving the performance of the CurdBee API. There were certain heavily consumed endpoints which also had tight coupling to external resources. I wanted to extract these endpoints out of the main app to cut the cruft and improve throughput.

This required turning them into bare-metal services which could utilize asynchronous processing. Weighing the amount of code we could reuse, it was better to stick with a Ruby implementation rather than switching to a specialized evented platform such as Node.js. However, implementing something like this in Ruby is a challenge, because the Rack interface itself is not designed to be asynchronous.

Luckily, there are a couple of ways to solve this problem. The most convincing solution I found was to use the Goliath framework from PostRank Labs. It implements a fully asynchronous stack, which includes a web server and a Rack-compatible API. Goliath hides the complexity of asynchronous processing from the developer. With Goliath, you can continue to write your code in the traditional top-down flow, avoiding "callback spaghetti".

Goliath's Magic Secret

Goliath serves requests using an EventMachine loop. For each request, Goliath creates a Fiber, a continuation mechanism introduced in Ruby 1.9. The Fiber is paused or resumed by EventMachine callbacks on IO operations.

Goliath handles all this by itself without needing developer involvement, which means developers are free to write code following the traditional top-down flow.

Goliath also implements an API closely related to Rack and ships with common Rack middleware modified for asynchronous processing. So from the outset, writing a Goliath app is very similar to writing any other Rack app.

Writing a Goliath app

Since Goliath depends on Fibers, you will require Ruby 1.9+ to write and deploy Goliath apps. Once you have set up Ruby 1.9 on your system, you can use gem install goliath to get Goliath.

If you have used other Rack frameworks like Sinatra before, grasping Goliath's conventions will be very easy. A simple Goliath app is a Ruby file with a class extending Goliath::API. As in Sinatra, the file name should be the lower-case class name (for example, if your class name is MyApi then your file should be saved as my_api.rb).

Goliath ships with a bunch of common & frequently used middleware, rewritten in an asynchronous manner. If you want to use any common middleware, make sure you use the Goliath equivalent. Also, if you want to write any custom middleware for your app, check the Goliath documentation for guidance.

Your endpoint class extending Goliath::API must implement a method named response, which should return an array consisting of the status, headers and body (similar to a response in Rack).

A simple Goliath API implementation will look like this:

require "rubygems"
require "bundler/setup" #using bundler for dependencies

require 'goliath'
require 'mysql2'

class MyApi < Goliath::API
  use Goliath::Rack::Params             # parse query & body params
  use Goliath::Rack::Formatters::JSON   # JSON output formatter
  use Goliath::Rack::Render             # auto-negotiate response format

  def response(env)
    #check the auth key
    if(!params.include?('auth_key') || params['auth_key'] != auth_key)
      [401, {}, "Unauthorized"]
    else
      response = {"total": 0} 

      [200, {}, response] #since we use the JSON formatter middleware, output will be formatted as JSON
     end
  end

end

Goliath supports the convention of multiple environments and provides a simple mechanism for environment-specific configurations. You will have to create a config directory in the app path, and inside it create a file with the same name as your Goliath API, which defines the configurations.

A very important thing you should always keep in mind is that Goliath uses a Fiber to process each request. Unlike threads, Fibers are not preempted by a scheduler, which means a Fiber gets to run as long as it wants to. So if you use any blocking IO calls, they will lock the process, defeating the whole purpose of using Goliath. When you are picking IO libraries, make sure they are written in an asynchronous fashion. Most common libraries do support asynchronous processing; for example, the Mysql2 gem supports asynchronous connections via EventMachine & Fibers.

Here's how you can configure Mysql2 driver in Goliath to work in a non-blocking manner:

require 'mysql2/em_fiber'

environment :development do
  config['db'] = EM::Synchrony::ConnectionPool.new(:size => 20) do
    ::Mysql2::EM::Fiber::Client.new(:host => 'localhost',
                                    :username => 'root',
                                    :socket => nil,
                                    :database => 'myapi_db',
                                    :reconnect => true)
  end
end

On a database query, Goliath will pause the Fiber, allowing other requests to be processed, and will resume it when the results arrive.

Deploying with Capistrano

We are using Capistrano to deploy Goliath apps to production. The process is similar to deploying other Rack apps, and we use the railsless-deploy gem to avoid Rails-specific conventions in Capistrano. Here's what our Capfile and deploy recipes look like:

Capfile:

require 'rubygems'
require 'railsless-deploy'
load    'config/deploy'

deploy.rb:

require 'capistrano/ext/multistage'

set :stages, %w(staging production)
set :default_stage, "production"

# bundler bootstrap
require 'bundler/capistrano'

deploy/production.rb:


#############################################################
#    Application
#############################################################

set :application, "myapi"
set :deploy_to, "/home/app_user/apps/myapi"

#############################################################
#    Settings
#############################################################

default_run_options[:pty] = true
ssh_options[:forward_agent] = true
set :use_sudo, true 
set :scm_verbose, true
set :keep_releases, 3 unless exists?(:keep_releases)

#############################################################
#    Servers
#############################################################

set :user, "app_user"
set :domain, "0.0.0.0"
set :password, "app_pwd"

server domain, :app, :web
role :db, domain, :primary => true

#############################################################
#    Git
#############################################################

set :scm, :git
set :deploy_via,    :remote_cache
set :branch, "master"
set :repository, "ssh://git@repo_server/repos/myapp.git"

namespace :deploy do

  desc "Restart the app after symlinking"
  task :after_symlink do
    try_sudo "god restart myapi"
  end

end

Monitoring Goliath with God

Goliath is also a standalone app server, so you don't need any other Ruby app server to run Goliath apps. However, we use God to monitor the Goliath process for memory leaks and to automatically restart it on failure. Here is the God config we are using for Goliath:

app_path = '/home/app_user/apps/myapi/current'
app_env = 'prod'
ruby_path = '/usr/bin/ruby' #change this if you are using RVM

God.watch do |w|
  # script that needs to be run to start, stop and restart
  w.name          = "my_api" 
  w.interval      = 60.seconds

  w.start         = "cd #{app_path} && #{ruby_path} my_api.rb -e #{app_env} -p 9201 -d" 

  # QUIT gracefully shuts down workers
  w.stop = "kill -QUIT `cat #{app_path}/goliath.pid`"

  w.restart = "#{w.stop} && #{w.start}"

  w.start_grace   = 20.seconds
  w.pid_file      = "#{app_path}/goliath.pid" 

  w.uid = 'app_user'
  w.gid = 'app_user'

  w.behavior(:clean_pid_file)

  w.start_if do |start|
    start.condition(:process_running) do |c|
      c.interval = 60.seconds
      c.running = false
    end
  end

  w.restart_if do |restart|
    restart.condition(:memory_usage) do |c|
      c.above = 300.megabytes
      c.times = [3, 5]
    end

    restart.condition(:cpu_usage) do |c|
      c.above = 50.percent
      c.times = 5
    end
  end

  w.lifecycle do |on|
    on.condition(:flapping) do |c|
      c.to_state = [:start, :restart]
      c.times = 5
      c.within = 5.minute
      c.transition = :unmonitored
      c.retry_in = 10.minutes
      c.retry_times = 5
      c.retry_within = 2.hours
    end
  end
end

Proxying with Nginx

Finally, there should be a way to forward HTTP requests to the Goliath process(es). This can be done by setting up a simple Nginx proxy.

upstream myapi { 
 server localhost:9201; 
} 

server { 
 listen 80; 
 server_name myapi.myapp.com; 

 location / { 
   proxy_pass  http://myapi; 
 }
}

I hope this information helps you get started with Goliath; for further details, please check the Goliath website.

]]>
A Month in Vienna http://laktek.com/2011/06/08/a-month-in-vienna http://laktek.com/2011/06/08/a-month-in-vienna/#comments Tue, 07 Jun 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/06/08/a-month-in-vienna I'm writing this post from 30,000ft above the ground, flying back home from Vienna.

I came to Vienna last month on an invitation from the GENTICS crew (the company behind the awesome Aloha Editor) to hack on a really interesting project. I will not reveal many details about the project for now, since it's still a little too premature for that (don't worry, you will get to hear a lot about it in the near future).

In this post, I'd like to share what it was like to live in the world's most livable city for one month.

From the moment I landed in Vienna, I was impressed by the convenience and reliability of the public transport. Trams (street cars), subway trains and buses operate until around midnight, and they are very frequent. Once you get used to the subway system, moving around the city is a piece of cake. Entrances to underground stations are clearly visible from a distance, and prominent color codes are used for each underground line. Also, you can travel on any public transport service with a single ticket (valid for a day, week or month). Riding public transport is also very comfortable; there's no rush even during peak hours.

Vienna is a masterpiece of great architecture and town planning. It's amazing how they have managed to preserve the traditional architectural styles throughout the city. Not only the chapels, palaces, museums and theaters, but every building in the city has its own glory.

There are lush green parks and pathways throughout the city. Walking along these pathways in the evening, or sitting on a bench to read on a weekend, can be the finest luxury you will ever experience. Another interesting fact about Vienna is that you can drink the tap water. It is remarkably pure, as it comes fresh from the mountains.

You can't talk about Vienna without talking about its food. You have to experience the Viennese Schnitzel, Tafelspitz, Melange (Viennese cappuccino) and the great range of wines. The ice creams and chocolates in Vienna are simply overwhelming for someone like me, who has a sweet tooth.

Most of the people I met in Vienna were extremely smart and broad-minded. I guess they have inherited this from a knowledge-driven culture and an appreciation of the arts. This free culture allows anyone to live respectably, regardless of their gender, religion, ethnicity, profession or sexual orientation. The city is very safe, and I didn't see a single news report about crime during my stay.

Also, it's interesting to note that most people in Vienna are obsessed with reading. On public transport, almost everyone has a paper, book or Kindle in their hands.

As I see it, Vienna has become an immaculate city not because of its past glory, wealth or technology, but because of the great discipline you find in its people. People in Vienna are disciplined in the way they work, travel and even have fun. When you do your part responsibly, it raises not only your own living conditions but also those of others.

I would like to wrap up this post with an interesting quote from my caretaker. He used to say this every time he served breakfast: "Cooked with love and served with charm!". That's how I will cherish Vienna.

]]>
Introducing jQuery Smart AutoComplete... http://laktek.com/2011/03/03/introducing-jquery-smart-autocomplete http://laktek.com/2011/03/03/introducing-jquery-smart-autocomplete/#comments Wed, 02 Mar 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/03/03/introducing-jquery-smart-autocomplete A few months ago, we did a major revamp of the CurdBee UI. In this revamp, we decided to make use of autocomplete on several data-heavy inputs to improve usability. We assumed any existing JavaScript autocomplete plugin could easily be modified to suit our requirements.

Unfortunately, that wasn't the case. None of those plugins were comprehensive and flexible enough to cover the cases we had at hand. Here are some of the issues we faced with the existing plugins:

  • They introduced a whole bunch of new dependencies.
  • They didn't support custom filtering algorithms.
  • You cannot modify the HTML of the results (and styling the plugin-generated markup becomes a nightmare).
  • You cannot specify what to do when there are no results.

So I had to write a custom jQuery based autocomplete plugin to suit the requirements of CurdBee. Later, I used this same plugin on several other hobby projects with slight modifications.

At this point, I realized this could be something useful for others too. After a couple of weeks of free-time hacking, I'm ready to introduce you to the Smart Autocomplete plugin!

First of all, I suggest you check out the different use cases of the plugin by visiting the demo page. If those examples pump you up, read on!

Basic Usage

Basic Autocomplete Screenshot

Using Smart Autocomplete in your projects is easy. The only dependency it has is the jQuery core library. Make sure you have a recent jQuery core version (1.5 or above); if not, you can download it from here.

To get the Smart Autocomplete plugin visit its GitHub page.

Once you have downloaded both jQuery and the Smart Autocomplete plugin, reference them in the header of your page like this:

  <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js" type="text/javascript"></script>
  <script src="jquery.smart_autocomplete.js" type="text/javascript"></script>

Now, define an input field to use for autocomplete.

  <div>
    <label for="fruits_field">Favorite Fruit</label>
    <input type="text" autocomplete="off" id="fruits_field"/>
  </div>

To enable autocompletion on the text field, you must select it using jQuery and call the smartAutoComplete method on it:

  $(function(){

   $('input#fruits_field').smartAutoComplete({
    source: ['Apple', 'Banana', 'Orange', 'Mango']
   });

  });

As you can see in the above example, the only required option is the source. Source defines the set of values that will be filtered when the user starts typing in the field. Source can be either an array or a string specifying a URL. If you specify a URL, the filter function will send an AJAX request to that URL, expecting a JSON array as a response (check Example 2).

You will also need to add a couple of styles to display the results correctly.

    ul.smart_autocomplete_container li {list-style: none; cursor: pointer; margin: 10px 0; padding: 5px; background-color: #E3EBBC; }
    li.smart_autocomplete_highlight {background-color: #C1CE84;}

Once you complete the above steps, your text field will have autocomplete capabilities.

Power of One-liners

Though Smart Autocomplete comes with sensible defaults, you can easily customize the default behavior by setting a couple of one-line options.

If you check the Example 2, you will notice the autocomplete field is defined as follows:

    $("#country_field").smartAutoComplete({ 
      source: "countries.json", 
      maxResults: 5,
      delay: 200,
      forceSelect: true
    });

Apart from the required source field, it has 3 other options defined. The maxResults option sets the maximum number of results to be shown; the delay option sets the number of milliseconds the plugin should wait (after the user presses a key) before calling the filter function.

By setting the forceSelect option to true, you can block free-form values in the autocomplete field. This means the user has to select a value from the given suggestions, or the field will reset to blank.

You must have seen how Google Instant Search completes the rest of the phrase in gray with the best matching result. It's possible to implement similar behavior with Smart Autocomplete too.

typeAhead screenshot

  $("#type_ahead_autocomplete_field").smartAutoComplete({
    source: ['Apple', 'Banana', 'Orange', 'Mango'],
    typeAhead: true
  });

All you have to do is set the typeAhead option to true. Isn't that easy?

You can find all available options of the plugin in the README.

Define your own Filtering Algorithm

The Smart Autocomplete plugin gives you the flexibility to override the built-in filter function with a custom function. A custom filter function should return either an array of results or a deferred promise that resolves to an array. If you call jQuery's Ajax methods, they return an object containing a promise by default.
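
For instance, a custom filter that fetches suggestions from a remote endpoint could look roughly like this (the field, URL and query parameter are made up for illustration; jQuery's $.getJSON returns a promise, which satisfies the contract described above):

  $("#city_field").smartAutoComplete({
    source: "/cities.json",
    filter: function(term, source){
      // the jqXHR returned here is a promise that resolves with the parsed JSON array
      return $.getJSON(source, { term: term });
    }
  });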

In the 5th example, we use Quicksilver-like filtering to filter the names of jazz musicians.

Jazz Musicians Screenshot

We use the JavaScript port of the Quicksilver string ranking algorithm, which adds a score() method to the String prototype, to filter the items.

Here's how the smartAutoComplete method is called on the field, with our custom filter function:

  $("#jazz_musicians_field").smartAutoComplete({
        source: [
          "Scott Joplin",
          "Charles Bolden",
           //snip 
        ],

        filter: function(term, source){
            var filtered_and_sorted_list =  
              $.map(source, function(item){
                var score = item.toLowerCase().score(term.toLowerCase());

                if(score > 0)
                  return { 'name': item, 'value': score }
              }).sort(function(a, b){ return b.value - a.value });

            return $.map(filtered_and_sorted_list, function(item){
              return item.name;
            });
          }

        });

Do more with Events

Smart Autocomplete was written following an event-driven approach, and therefore it emits events at every major step in its process. You can bind handlers to these events just as with other jQuery events. Also, it's possible to cancel the default behavior of an event by calling the ev.preventDefault() method. This makes extending and customizing Smart Autocomplete easy.

In example 6, we make use of the evented model of Smart Autocomplete to build a multi-value selector.

Multi Select Screenshot

Let's see how the implementation is done:

  $("#textarea_autocomplete_field").smartAutoComplete({source: "countries.json", maxResults: 5, delay: 200 } );
  $("#textarea_autocomplete_field").bind({

     keyIn: function(ev){
       var tag_list = ev.customData.query.split(","); 
       //pass the modified query to default event
       ev.customData.query = $.trim(tag_list[tag_list.length - 1]);
     },

     itemSelect: function(ev, selected_item){ 
      var options = $(this).smartAutoComplete();

      //get the text from selected item
      var selected_value = $(selected_item).text();
      var cur_list = $(this).val().split(","); 
      cur_list[cur_list.length - 1] = selected_value;
      $(this).val(cur_list.join(",") + ","); 

      //set item selected property
      options.setItemSelected(true);

      //hide results container
      $(this).trigger('lostFocus');

      //prevent default event handler from executing
      ev.preventDefault();
    }

  });

As shown in the above code, we have handlers bound to the keyIn and itemSelect events. Let's try to understand the roles of these two handlers.

The Smart Autocomplete plugin stores context data for an event in a special smartAutocompleteData object. Default actions use this context data during their execution. For example, the default keyIn action gets the query parameter from the event's smartAutocompleteData object.

To implement multi-value selection, we need to set the query parameter to the last phrase entered (the text after the last comma), which is then passed to the default keyIn action. The custom handler we've defined for the keyIn event does that by overriding the query field.

The itemSelect handler we defined concatenates the selected item to the field. It also calls ev.preventDefault() to keep the default action from running.

Apart from the two events we've used in the example, there is a set of other events available to use. You can find the complete list in the README.

Digging Deeper

In this post, I only highlighted the most important features of the Smart Autocomplete plugin. You can learn more about the plugin's capabilities by reading the README and also the specs (available in the specs/core/ directory of the plugin).

As always, I encourage you to fork the plugin's code from the GitHub repo and modify it for your requirements. If you feel your changes can improve the plugin, please feel free to send a pull request via GitHub.

Also, if you run into any issues using the plugin, please report them on GitHub issues.

]]>
Understanding Prototypal Inheritance in JavaScript http://laktek.com/2011/02/02/understanding-prototypical-inheritance-in-javascript http://laktek.com/2011/02/02/understanding-prototypical-inheritance-in-javascript/#comments Tue, 01 Feb 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/02/02/understanding-prototypical-inheritance-in-javascript Behavior reuse is one of the key aspects of object-oriented programming. Many mainstream object-oriented languages achieve behavior reuse by using class-based inheritance. In class-based inheritance, a class defines how objects stemming from it should behave.

However, not all languages use class-based inheritance to achieve behavior reuse. The best possible example is JavaScript. It doesn't have a concept of classes. Many developers get confused about JavaScript's object-oriented capabilities due to this fact. But in reality, JavaScript is a more expressive and flexible object-oriented language than some of the mainstream languages.

If JavaScript doesn't have class-based inheritance, how does it reuse behavior? For that, it uses a technique called prototypal inheritance.

In prototypal inheritance, an object is used to define the behavior of another object. Let's try to understand this with a simple example:

    var father = {
     first_name: "James", 
     last_name: "Potter",
     hair_color: "black",
     is_good_at_quidditch: true,

     name: function(){
      return this.first_name + " " + this.last_name
     }
    }

    var son = {
     first_name: "Harry" 
    }
    son.__proto__ = father;

    father.name()
    >> James Potter

    son.name()
    >> Harry Potter

    son.hair_color
    >> black

    son.is_good_at_quidditch
    >> true

Here the 'father' object acts as the prototype for 'son'. Hence, 'son' inherits all the properties defined for 'father' (note that the __proto__ property of the 'son' object was explicitly set to make 'father' its prototype).

Even though it was used as a prototype, the 'father' object can still be manipulated as a regular object. This is the main difference between a prototype and a class.

Object Hierarchy

The process of an object responding to a property access in JavaScript is fairly straightforward. The object first checks whether it defines the property on its own; if not, it delegates the lookup to its prototype object. This chain continues up the object hierarchy until the property is found.

Talking about the object hierarchy, all objects in JavaScript descend from the generic Object. The generic Object prototype is the default prototype set on all objects at instantiation, unless a custom prototype object is defined.

So any given inheritance hierarchy in JavaScript is a chain of objects with the generic Object prototype at the root.
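
To make the lookup concrete (reusing the 'father' and 'son' objects from the earlier example), hasOwnProperty shows which properties live on the object itself and which are found further up the chain:

    son.hasOwnProperty('first_name')
    >> true

    //'hair_color' is not defined on 'son' itself, so the lookup falls through to 'father'
    son.hasOwnProperty('hair_color')
    >> false

    son.hair_color
    >> black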

Creating New Objects

Though JavaScript doesn't have classes, you can define a constructor function and call it with the new keyword to instantiate a new object. As I mentioned before, unless a custom prototype is defined, the newly created object ultimately descends from the generic Object prototype.

Let's take an example of creating basic shape objects. The constructor takes the number of sides and vertices as the arguments.

var Shape = function(sides, vertices){
  this.sides = sides; 
  this.vertices = vertices; 
}
var triangle = new Shape(3, 3);

What if we want to create different types of triangles? Yes, we can use our basic shape object as the prototype for all our triangle objects.

var Triangle = function(angles, side_lengths){
  this.angles = angles || [60, 60, 60]; 
  this.side_lengths = side_lengths || [5, 5, 5]; 
}
Triangle.prototype = new Shape(3, 3);

var isosceles_triangle  = new Triangle([70, 70, 40], [5, 5, 10]);
var scalene_triangle  = new Triangle([70, 60, 50], [5, 10, 13]);

isosceles_triangle.sides
>> 3

isosceles_triangle.vertices
>> 3

scalene_triangle.sides
>> 3

scalene_triangle.vertices
>> 3

Basically, when you call a constructor function with the new keyword, it sets the __proto__ property of the newly created object to the object defined in the prototype property of the constructor function.
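
You can verify this link in a console with the objects from the example above:

    scalene_triangle.__proto__ === Triangle.prototype
    >> true

    //Triangle.prototype was set to a Shape instance, so the chain continues up to Shape.prototype
    Triangle.prototype.__proto__ === Shape.prototype
    >> true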

Modifying Prototype Object at Runtime

All objects in JavaScript can be modified at runtime. Since prototype objects are also regular objects, we can modify them too. However, when you modify a prototype object, the changes are reflected in all the objects that descend from it.

  Triangle.prototype.area = function(base, height){
    return(1/2 * base * height);
  }

  isosceles_triangle.area(10, 4); 
  >> 20

What's most interesting is that we can use this approach to extend the built-in objects in JavaScript. For example, you can extend the String object's prototype to add a capitalize method.

    String.prototype.capitalize = function(){
      return this.charAt(0).toUpperCase() + this.slice(1);
    };

    "john".capitalize();
    >> John

Further Reading

If you'd like to learn more about JavaScript's object model and prototypal inheritance, you will find the following articles/posts useful.

  • Details of the object model (MDC Doc Center)
  • Inheritance revisited (MDC Doc Center)
  • Classical Inheritance in JavaScript (by Douglas Crockford)
  • Prototypal Inheritance in JavaScript (by Douglas Crockford)
  • Simple “Class” Instantiation (by John Resig)
]]>
    jQuery isBlank() http://laktek.com/2011/01/07/jquery-isblank http://laktek.com/2011/01/07/jquery-isblank/#comments Thu, 06 Jan 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/01/07/jquery-isblank One of my favorite syntactic sugar methods available in Rails is Object.blank?, which evaluates to true if the given object is false, empty, or a whitespace string. It makes your conditional expressions more readable by avoiding the use of boolean operators.

    It would be cool if we could have the same convenience when writing client-side code in JavaScript. Unfortunately, jQuery core doesn't have such a utility function. The closest you get is jQuery.isEmptyObject. It returns true for null, undefined, empty objects and empty arrays; but you can't match whitespace strings with it (which are of course not empty objects).

    So, I wrote this small jQuery plugin to check whether the given object is blank:

    (function($){
      $.isBlank = function(obj){
        return(!obj || $.trim(obj) === "");
      };
    })(jQuery);
    
    $.isBlank(" ") //true
    $.isBlank("") //true
    $.isBlank("\n") //true
    $.isBlank("a") //false
    
    $.isBlank(null) //true
    $.isBlank(undefined) //true
    $.isBlank(false) //true
    $.isBlank([]) //true

    As shown in the above examples, it identifies as blank any value that evaluates to false (null, undefined, false), as well as empty arrays and whitespace strings.

    Update: jtarchie commented on this gist suggesting an alternative method, which would even match empty objects.

    ]]>
    Looking Back at 2010 http://laktek.com/2010/12/31/looking-back-at-2010 http://laktek.com/2010/12/31/looking-back-at-2010/#comments Thu, 30 Dec 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/12/31/looking-back-at-2010 It was a year of transition in my life. Some of the highlights of 2010 were:

    1. Completed my degree and marked the end of formal education.
    2. Saw CurdBee becoming a more established product (We shipped loads of new features in 2010).
    3. Completed another successful Google Summer of Code (probably my last) with OpenNebula project.
    4. Got to learn lots of new stuff (asynchronous processing, real-time communication, etc) by working on Realie project, which was my final year project in University.
    5. Increased traffic to my blog by 300% (but I should have blogged more)
    6. Bought my first car.
    7. Finally, Bought an iPhone :).

    Lots of interesting things have already been planned for 2011. It's surely going to be a more challenging year ahead.

    ]]>
    Building Modular Web Apps with Rack & Sinatra http://laktek.com/2010/12/22/building-modular-web-apps-with-rack-sinatra http://laktek.com/2010/12/22/building-modular-web-apps-with-rack-sinatra/#comments Tue, 21 Dec 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/12/22/building-modular-web-apps-with-rack-sinatra Working on OpenNebula's Administration Tool during the last Google Summer of Code was one of the best development experiences I had in 2010. The project has been successfully completed and is awaiting release with a future version of OpenNebula.

    In this post, I would like to give some insights on its development, since I believe it stands as a good case study on how to build modular web apps, especially using Rack & Sinatra.

    Background

    The main objective of OpenNebula's Admin Tool is to enable easy management & monitoring of your OpenNebula cloud setup via a web-based interface. Basically, this includes management of users, hosts, virtual networks and virtual machines (VMs). It's planned to be extended further to offer features like infrastructure checks, installation and configuration tweaking of an OpenNebula setup (which are already in development).

    Also, it is expected to be self-hosted, interfacing with an OpenNebula front-end server. It interacts with OpenNebula using its Ruby API.

    In order to achieve these requirements, the application needed to be modular, self-contained and easily customizable. If we had used an opinionated framework like Rails, we would have spent the majority of the development time tweaking the framework for the problem domain rather than focusing on the problem domain itself. On the other hand, building such a fully featured app from scratch within a 3-month timeline was not realistic either.

    Against this background, I started exploring the possibility of using mini-frameworks (web DSLs), specifically Sinatra. From my mentor, I learned that they had used Sinatra for certain parts of the OpenNebula project, so it was a safe bet to try in this context.

    Since Sinatra inherently follows the concepts of Rack, Rack's rich middleware stack can be used to bridge the functionality of the apps.

    Collection of Mini-Apps

    Looking at the overall app, it is composed of loosely coupled resource modules which have minimal interaction or dependency between them. This made it possible to contain each resource module in its own mini-app, which means adding, removing or customizing a module can be done without affecting the behavior of the others.

    class HostsApp < Sinatra::Base
    
      #define the model
      require 'models/host'
    
      #define the views (based on mustache)
      register Mustache::Sinatra
      require 'views/hosts/layout'
      set :mustache, {
        :views => 'views/hosts',
        :templates => 'templates/hosts'
      }
    
      set :sessions, true
    
      get '/list', :provides => :json do
        Host.all.to_json
      end
    
      get '/list' do
        @hosts = Host.all
        @flash = flash
    
        unless @hosts.include?(:error)
          mustache :list
        else
          puts "Error: "+@hosts[:error]
          "<h1>Oops..An error occurred.</h1><p>#{@hosts[:error]}</p>"
        end
      end
    end

    Above is a simplified example of how a mini-app is defined. It extends the Sinatra::Base class and follows an explicitly defined MVC pattern. API calls are wrapped in a separate model class, while output generation is done using Mustache-based view templates. So it is basically similar to a controller in Rails.

    In the above code block, you may notice there are two routes defined for the GET /list path. The only difference is that one route has a :provides condition, which means it only responds to requests accepting JSON as the content type. This way, we can offer different response types for the same resource (i.e. an API) in Sinatra.

    Template Rendering

    As I mentioned earlier, I used Mustache for generating the views of the project. This was also the first time I used Mustache, and I was really hooked by its flexibility.

    Mustache differs from traditional language-specific templating schemes by defining its own logic-less template format. This makes it possible to reuse the same template in different contexts. For example, in this project I used the same template for server-side rendering with Sinatra and again on the client side (with JavaScript) when data is loaded via AJAX.
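
    To give a rough idea of that client-side reuse, here is how a hypothetical host template could be rendered with mustache.js (the template and data below are made up for illustration, not the project's actual views):

    var template = "<li>{{name}} ({{state}})</li>";
    var data = { name: "host01", state: "running" };

    // mustache.js fills the logic-less template with the given data
    var html = Mustache.render(template, data);
    //=> "<li>host01 (running)</li>"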

    Exploring Mustache's capabilities in detail would take a post of its own, so I'll leave it for a future post.

    Routing

    In order to form one high-level application, the individual mini-apps with different end-points needed to be mapped to a single address space.

    For this purpose, I used Rack::Mount, a library written by Josh Peek, which also powers Rails 3's default routing. It simply routes requests to individual Rack apps based on the path.

    This is what the route set for the Admin Tool looks like (which I hope is self-explanatory):

    # route paths to different apps
    Routes = Rack::Mount::RouteSet.new do |set|
      set.add_route UserSessionApp, { :path_info => %r{^/user_session*} }, {}, :user_session
      set.add_route HostsApp, { :path_info => %r{^/host*} }, {}, :host
      set.add_route VirtualNetworksApp, { :path_info => %r{^/vnet*} }, {}, :vnet
      set.add_route VirtualMachinesApp, { :path_info => %r{^/vm*} }, {}, :vm
      set.add_route UsersApp, { :path_info => %r{^/user*} }, {}, :user
      set.add_route DashboardApp, { :path_info => %r{^/$} }, {}, :dashboard
    
      #public file routes
      set.add_route Rack::File.new(File.dirname(__FILE__) + "/public"), { :path_info => %r{^/public*} }, {}, :public
    end
    
    # run the routeset
    run Routes

    User Authentication

    Another important concern of this project was how to enforce user authentication. Admin console access needed to be restricted by the login credentials defined by One Client of OpenNebula.

    There are several authentication middleware libraries available for Rack. Out of those, Warden seems to be the most flexible and well documented. The ability to easily define custom authentication strategies also made it more suitable for our requirements.

    This is how the authentication strategy based on one_client was defined using Warden:

    Warden::Strategies.add(:password) do
      def valid?
        params["user_name"] || params["password"]
      end
    
      def authenticate!
        u = get_one_client
        (u.one_auth == "#{params["user_name"]}:#{Digest::SHA1.hexdigest(params["password"])}") ? success!(u) : fail!("Could not log in")
      end
    end

    Another interesting thing about Warden is that it is only invoked when we explicitly call it. Otherwise, it just remains an object in the Rack environment, without getting in the way of application execution. To invoke Warden, we can call it within a before filter in Sinatra. Request processing will continue or halt depending on the authentication result.

    before do
      #check for authentication
      unless env['warden'].authenticated?
        session["return_to"] = request.path
        redirect "/user_session/new"
      end
    end

    Other essential Rack Middleware

    There are a couple of other Rack middleware libraries that were used in this project, which provide some of the essential conveniences we have in Rails.

    One such middleware is Rack::NestedParams (available in the Rack Contrib package), which is used to handle nested form parameters properly. Rack::Flash is also useful; it gives the option of adding flash messages (success, errors and warnings) to the app.

    Source Code

    You can view the full source code of the OpenNebula's Admin Tool from its repository at http://dev.opennebula.org/projects/one-admin-tool/repository

    ]]>
    The Best Role Model of Our Time http://laktek.com/2010/07/23/best-role-model-of-our-time http://laktek.com/2010/07/23/best-role-model-of-our-time/#comments Thu, 22 Jul 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/07/23/best-role-model-of-our-time "Who is the biggest role model of your life?" My answer to that question would be Muttiah Muralitharan. I know that answer would confuse most of you. You would expect a geek like me to name someone like Linus Torvalds, Yukihiro Matsumoto, or Sergey Brin and Larry Page as a role model. But how can a cricketer be my role model?

    Murali

    In an era where vanity role models are hyped to the top by the mass media, Murali stands out from the rest on his own feet. He is great not only for his phenomenal performances on the cricket field, but also for his character. None of us could ever emulate his unique bowling action. But there are certain things that we can try to emulate from Murali's character.

    A Human!

    It was somewhere in 1998. As a kid, I had a rather unusual hobby of collecting cricket statistics. In those days, I didn't know of the existence of Cricinfo and didn't even have a computer. I used to record all the scorecards of the matches played during that time in an exercise book. However, my record collection was never complete; I missed a lot of scorecards from the old matches. Then my uncle tipped me off that the Sri Lankan Cricket Board has a library, where they keep all the old Wisdens and "The Cricketer" magazines. I could use it to collect the missing match records. After a lot of persuasion, I was able to convince him to take me to this place.

    Inside the Cricket Board, we had to go past a gym to get to the library. I saw a very familiar face inside the gym. It was Murali! He was there with another cricketer (as I remember, it was Ravindra Pushpakkumara). That was the first time I saw an international cricketer in real life. Murali also saw me and waved. Then I tried to go inside the gym to get his autograph. The instructor of the gym was there. He told me not to disturb the players and didn't allow me to go inside. As I was turning back in disappointment, the most surprising thing happened. Murali came to the door and signed my autograph book!

    This was the period when Murali had been called for chucking for the second time in Australia, and he was preparing to undergo medical tests to prove the legitimacy of his action. So while he signed my autograph, I told him how angry I was with the Australians for the injustice done to him. Handing my autograph book back, with a bright smile on his face, Murali said in Sinhalese, "owa ohoma tamai malli..." (these things happen).

    I couldn't believe how humble and down to earth this person was. He was ready to go out of his way to make some random, pesky kid happy. He could still afford to smile genuinely and take things lightly amidst all the trouble he was experiencing at that time. Even today, when I reminisce about this incident, it feels like a dream.

    A Geek!

    For me, Murali is a geek. He's not a geek who uses Linux as his primary OS, lurks on IRC or hacks micro-controllers. But his passion for and obsession with the game of cricket make him a geek in that field.

    He's not only a geek, but a hacker. He changed the face of off-spin bowling. When the off-spinner was about to go extinct from the game of cricket, Murali came and made it more challenging. He forked the doosra from Saqlain Mushtaq and hacked it into a more lethal weapon. By being different from the rest, he created controversy.

    Also, Murali focused only on doing what he can do best. He didn't have to ride Lamborghinis, have affairs with Bollywood actresses (but he's got a beautiful wife) or get into politics to keep himself in the limelight. He made the world talk about him and respect him by doing what he can really do - bowling.

    A Workaholic!

    When Murali first won his test cap for the Sri Lanka team, it was not the professional, winning outfit you find today. At that time, Sri Lanka was ranked only ahead of Zimbabwe, and playing for the national team was not even considered a profession. Due to the political instability in the country during that time, there was no certainty about tours, and there were no policies for player selection. The future was gloomy and involved a lot of risk. I would say the situation was analogous to working for a startup in the corporate world.

    He could have easily stayed in Kandy looking after his family business. Instead, Murali took the challenge and came to Colombo to join the national team. It wasn't an easy start. His brilliance was not an overnight wonder. As the stats show, it took 27 test matches and 3 years to complete his first 100 wickets. During that time, he bowled full days without much support from the other end and tasted heavy defeats.

    Murali persisted and persevered harder. As his performances improved, so did the Sri Lankan team's winning ratios. However, he focused not on his personal feats, but on his team's victory. He had no problem playing under different captains, even juniors to him like Mahela and Sanga. He delivered his best in all circumstances. He never let his personal ego hinder his duty.

    In the last 18 years, he has been working like a horse. For the record, he bowled 33% of all the overs Sri Lanka bowled during that time. He always made himself available for national duty over other more lucrative engagements, such as county cricket in England.

    How many of us can have such dedication and commitment to our duty? How many of us would complain if we had to repeat the same old boring job? Murali was no such person. When he was on the field, he seemed to be enjoying every moment of it. That should be the secret mantra of Murali's success. That's why I call him the best role model of our era.

    Hail Murali!

    (Photo credit: Wikimedia Foundation - http://upload.wikimedia.org/wikipedia/commons/d/d4/MuralitharanBust2004IMG.JPG)

    ]]>
    Handy Git commands that saves my day http://laktek.com/2010/06/04/handy-git-commands-that-saves-my-day http://laktek.com/2010/06/04/handy-git-commands-that-saves-my-day/#comments Thu, 03 Jun 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/06/04/handy-git-commands-that-saves-my-day There are 3 essential weapons every developer should have in their armory. A text editor, a terminal and a source code management system. Picking powerful, flexible and personalized tools will make your workflows more productive. As a developer, I use Vim for text editing, bash as the terminal and Git for source code management.

    Out of those, Git turns out to be the most fascinating piece of software to me. It's more than an SCM system. It represents a paradigm shift in the way we code. Its decentralized nature gives you the freedom to experiment and innovate without having to worry about others' work. It brings sharing and collaboration of work to a new level. It's like democracy in coding!

    A basic understanding of Git's pull-commit-push cycle may be sufficient for most daily routines. However, there is a plethora of other options which deserve some time for comprehension. Here are some such commands, which I have found useful and use in my regular workflows.

    git-rebase

    When I first started to use Git, my workflow used to be like this:

      git pull origin master
      git checkout -b branch_for_new_feature
      git status
      git commit -am "commit message"
      <--cycle from step 2-4, until my work is complete-->
      git checkout master
      git merge branch_for_new_feature
      git push origin

    However, on many occasions when I tried to push the changes to the remote server (origin), I would end up with the following error:

      ! [rejected] master -> master (non-fast forward) error: failed to push some refs to 'ssh://user@server:/repo.git'

    This is because my colleagues had also pushed to the remote server while I was working on the new feature. At this point, I could simply do a git pull to update my local repo, which would merge the remote changes with the current HEAD. In most cases, this leads to a chain of conflicts which requires manual resolution. Due to my laziness (or lack of concentration), I often end up deleting what was on the HEAD and replacing it with the upstream changes. Midway, I realize I was actually deleting the new updates on the HEAD that were supposed to be pushed to the remote server. From that point onwards, cleaning up the mess involves pulling out my hair and a lot of swearing!

    Actually, there is a smarter way to do this: use git-rebase. When you do a rebase, it saves all the commits in the current branch that are not available in the upstream branch to a staging area, and resets the current branch to the upstream branch. Then those saved commits are reapplied on top of the current branch, one by one. This process ensures my newest changes remain the newest.

    The new workflow with rebasing would be:

      git pull origin master
      git checkout -b new-feature-branch
      git status
      git commit -am "commit message"
      <--cycle from step 2-4, until work is done-->
    
      git checkout master
      git pull origin master #update the master before merging the new changes
      git checkout new-feature-branch
      git rebase master #apply the new changes on top of current master
    
      git checkout master
      git merge new-feature-branch
      git push origin

    Though it seems longer than the previous workflow, it helps me stay away from unnecessary conflicts. Even if they do occur, resolving them is pretty straightforward, as I know for sure which change is the newest.

    git-cherry-pick

    While working on a new-feature-branch, I often encounter quick fixes that are independent of the new feature and thus can be applied to master on their own. Delaying the release of these fixes until the new-feature-branch gets merged to master seems unnecessary. In such cases, git-cherry-pick comes in handy. As the name implies, you can pick exactly one commit (by its ref-id) and apply it to another branch. To avoid conflicts, those commits should be self-contained patches. If a commit depends on another commit, you will need to apply that one first.
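
    For example, with the SHA of the fix at hand (the SHA below is only a placeholder):

      git checkout master
      git cherry-pick 3a4f21b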

    git-blame & git-show

    Some days, you wake up to find someone has changed your pet hack! Rather than blaming the cat for eating the code, you can easily find out who the real culprit is by running:

        git blame application.js

    It would return the author and last commit SHA for all lines in the file. You can even narrow down the output by specifying the starting and ending lines:

      git blame -L 450,+10 application.js

    If you want to investigate further, such as why the author actually made this change and what other patches were committed along with it, you can run:

      git show last_commit_sha

    git-bisect

    With git-blame, you were able to track down issues that are visible to the naked eye. But the most maddening bugs are spread out and harder to detect at a glance.

    This is more common when you work as a team: each person works on modular sections and has tests covering the code they write. Everything seems to be running smoothly until you merge all the modules together. Tests start to fail, and you are left with no clue about what breaks them. In such instances, what we normally do is roll back the commits one by one to find where the trouble starts. But this can become a tedious process if you have a large set of commits. Git has a handy assistant for this: the git-bisect command.

    When you specify a range of commits, it will repeatedly chop the range in half, using binary search, until you narrow it down to the commit that introduced the problem. A typical workflow with git-bisect is as follows:

        git bisect start
        # your repo would be switched to a temporary 'bisect' branch at this point
    
        # you mark the current HEAD of the repo as bad
        git bisect bad
    
        # Then you set the last known good version of the repo
        git bisect good version-before-merge
    
        # This will start the bisecting process.
        # Commits from last good version to current version will be chopped in half.
    
        # Then you run your tests
        rake test
    
        # Based on output of the test you mark the commit as good or bad
        git bisect good/bad
    
        # Git will chop the next subset automatically, and return for you to test
    
        # Test and marking process, will continue until you end-up with a single commit,
        # which is supposed to be the one which introduced the bug.
    
        # When bisecting process is done; run:
        git bisect reset
    
        # You will be returned to your working branch.

    git-format-patch/git-apply

    Contributing to some open source projects is as easy as sending a pull request via GitHub. But that's not the case with all of them. Especially in large projects such as Rails, you are expected to submit a ticket to the bug tracker, attaching the suggested patch. You have to use the git-format-patch command to create such a patch, which others can simply apply to their repositories for testing.

    In the same way, if you want to test someone else's patches, you need to use the command git-apply.
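
    For instance (the file names and the single-commit count below are only illustrative):

      # create a patch file from the latest commit on the current branch
      git format-patch -1 --stdout > my_fix.patch

      # apply someone else's patch to your working copy
      git apply their_fix.patch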

    git submodule

    Git submodules allow you to add, watch and update external repositories inside your main repository. It's my preferred way of managing plug-ins and other vendor libraries in Ruby or Node.js projects.

    To add a submodule to your project, run:

        git submodule add repository_url local_path

    When someone else takes a clone of your repo, they will need to run;

        git submodule init
        git submodule update

    This will import the specified submodules into their environment. Deployment tools such as Capistrano have built-in support for git submodules and will run the above two commands after checking out the code.

    git help command

    Last but not least, I should remind you that Git has excellent documentation, which makes learning it easy. To learn the options and use cases of a certain command, all you need to do is run:

        git help command

    Apart from the default man pages, there are plenty of resources on the web about Git, including the freely available books Pro Git and the Git Community Book.

    If you have any other interesting tips on using Git, please feel free to share them in the comments.

    ]]>
    Are you creating software to impress one person? http://laktek.com/2010/05/29/dont-create-software-just-to-impress-one-person http://laktek.com/2010/05/29/dont-create-software-just-to-impress-one-person/#comments Fri, 28 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/29/dont-create-software-just-to-impress-one-person Ever wondered why we have so much crappy and bloated software? The root cause is that it is built just to impress one person.

    This is a widespread disease in the software industry. At the academic level, you will find students writing software to impress their mentors and get the required credit. Then you get the developers working for large software firms, who are only concerned with getting the nod of their pointy-headed bosses. Freelancers only worry about getting sign-off from their pesky clients. You may expect startups to build their stuff out of passion. But in reality, most startups that run on funding are building software just to impress their VCs. This system is plain wrong!

    No software is intended to be used by just one person. In most cases, the person being impressed is not the actual end-user of the software. He may not have a clue about what the end-user really wants.

    If you are a developer, think of the wider audience who'd actually be using your stuff. Don't ignore them! Try to impress those people at the end of the day.

    ]]>
    Real-time Collaborative Editing with Web Sockets, Node.js & Redis http://laktek.com/2010/05/25/real-time-collaborative-editing-with-websockets-node-js-redis http://laktek.com/2010/05/25/real-time-collaborative-editing-with-websockets-node-js-redis/#comments Mon, 24 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/25/real-time-collaborative-editing-with-websockets-node-js-redis A few months ago, I mentioned that I'm developing a real-time collaborative code editor (codenamed Realie) as my individual research project at university. Since then, I have done a couple of posts on the design decisions and the technologies I experimented with for the project. After some extensive hacking, today I've got something tangible to share with you.

    Currently, I have implemented the essentials for real-time collaboration, including the ability to watch who else is editing the file, view others' edits, chat with the other collaborators and replay how the edits were made. You may think this is more or less similar to what Etherpad had - yes, it is! However, this is only the first part of the project, and the final goal is to extend this into a collaborative code editor (with syntax highlighting and SCM integration).

    Web Sockets

    The major difference between Realie and other real-time collaborative editors (i.e. Etherpad, Google Docs & Wave) is that it uses web sockets for communication between client and server. Web Sockets are a perfect fit for cases like this, where we need asynchronous, full-duplex connections. Compared to alternatives such as long-polling or Comet, web sockets are really efficient and reliable.

    In traditional HTTP, every message needs to be sent with HTTP headers. With web sockets, once a handshake is done between client and server, messages can be sent back and forth with minimal overhead. This greatly reduces bandwidth usage and thus improves performance. Since there is an open connection, the server can reliably send updates to the client as soon as they become available (no client polling is required). All this makes the app truly real-time.
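
    On the client, the standard WebSocket API keeps this simple; a minimal sketch (the URL and message format are only illustrative, not Realie's actual protocol) looks like this:

    var socket = new WebSocket("ws://localhost:3400/");

    socket.onopen = function(){
      // after the handshake, messages can be pushed in either direction
      socket.send(JSON.stringify({ type: "join", pad: "demo" }));
    };

    socket.onmessage = function(event){
      // the server pushes updates as soon as they are available
      console.log("received: " + event.data);
    };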

    As of now, the only browser to implement web sockets is Google Chrome. However, I hope other browsers will soon catch up, and Mozilla has already shown hints of support. There are also Flash-based workarounds for other browsers. For now, I decided to stick with the standard Web Socket API.

    Taking Diffs and Applying Patches

    In case you're wondering, this is how the real-time collaboration is done:

    1. When one user makes a change, a diff is created for the change and sent to the server.
    2. The server then posts this diff to the other connected collaborators of the pad.
    3. When a user receives a diff, their content is patched with the update.

    So both taking diffs and applying patches are executed on the client side. Handling these two actions in the browser was made trivial thanks to this comprehensive library written by Neil Fraser.
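
    A rough sketch of that flow using the diff-match-patch API (the variable names are only illustrative):

    var dmp = new diff_match_patch();

    // on the editing client: compute a patch between the previous and current text
    var patches = dmp.patch_make(previousText, currentText);
    var payload = dmp.patch_toText(patches); // serialized patch, sent to the server

    // on a receiving client: apply the incoming patch to the local content
    var incoming = dmp.patch_fromText(payload);
    var patched = dmp.patch_apply(incoming, localText)[0];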

    However, on some occasions these two actions need to be executed concurrently. By default, client-side scripts are executed in a single thread, which makes execution synchronous and slow. As a solution, I tried using the Web Workers API in HTML5 (implemented in WebKit & Mozilla). Separate worker scripts were used for taking diffs and applying patches. The jobs were passed to these worker scripts from the main script, and the results were passed back to the main script after execution was complete. Not only did this make things fast, it also made them more organized.
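
    The hand-off between the main script and a worker boils down to postMessage/onmessage; a minimal sketch (the worker file name and variables are hypothetical):

    var diffWorker = new Worker("diff_worker.js");

    diffWorker.onmessage = function(event){
      // event.data carries the serialized patch computed inside the worker
      socket.send(event.data);
    };

    diffWorker.postMessage({ oldText: previousText, newText: currentText });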

    Node.js for Backend Servers

    Initially, I started the server implementation in Ruby (and Rails). Ruby happened to be my de-facto choice, as it was my favorite language and I had enough competency with it. However, I soon started feeling that Ruby was not the ideal match for such an asynchronous application. With EventMachine it was possible to take things to a certain extent, yet most Ruby libraries were written in a synchronous manner (including Rails), which didn't help the cause. As an alternative, I started to play with Node.js and soon felt this was the tool for the job. It brings JavaScript's familiar event-driven model to the server, making things very flexible. On top of that, Google's V8 JavaScript engine turned out to be really fast. I decided to ditch the Ruby-based implementation and fully use Node.js for the backend system.

    The backend consists of two parts: one for serving normal HTTP requests and the other for web socket requests. For serving HTTP requests, I used a Node.js-based web framework called Express. It follows the same ideology as Sinatra, so it was very easy to adapt to.
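
    To give a flavor of that Sinatra-like style, a minimal Express handler looks roughly like this (the route and response are made up, and the setup call differs slightly between Express versions):

    var express = require('express');
    var app = express(); // older versions used express.createServer()

    app.get('/pads/:id', function(req, res){
      // look up the pad and render it (details omitted)
      res.send('pad ' + req.params.id);
    });

    app.listen(3000);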

    The web socket server was implemented based on the recently written web socket server module for Node.js by Micheil Smith. If you are interested in learning more about Node.js web socket server implementations, please see my earlier post.

    Message delivery with Redis Pub/Sub

    On each pad, there are different types of messages that users send on different events. These messages need to be propagated correctly to the other users.

    Mainly, the following messages need to be sent:

    • When a user joins a pad
    • When a user leaves a pad
    • When a user sends a diff
    • When a user sends a chat message

    For handling message delivery, I used Redis' newly introduced pub/sub implementation. Every time a user connects (i.e. visits a pad), two Redis client instances are initiated for him. One client is used for publishing his messages, while the other is used to listen for incoming messages from subscribed channels.
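
    With the node_redis client, that pair of connections can be set up roughly like this (the channel name, message shape and the user's socket connection are illustrative):

    var redis = require('redis');

    // one connection dedicated to subscriptions, another for publishing
    var subscriber = redis.createClient();
    var publisher = redis.createClient();

    subscriber.subscribe('pad:demo');
    subscriber.on('message', function(channel, message){
      // forward the incoming message to this user's web socket connection
      connection.write(message);
    });

    publisher.publish('pad:demo', JSON.stringify({ type: 'chat', text: 'hello' }));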

    Redis as a persistent store

    Redis is used not only for message handling; I also use it as the persistent data store of the application. As a key-value store, Redis provides fast in-memory data access. It also writes data to disk at a given interval (and there is a much safer append-only mode, which writes every change to disk). This mechanism is invaluable for this sort of application, where both fast access and data integrity matter.

    Another advantage of using Redis is the support for different data types. In Realie, the snapshots are stored as strings. The diffs, chat messages and users for a pad are stored as lists.

    There is a well-written Redis client for Node.js which makes the above tasks really simple.
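
    For instance, storing a snapshot and appending a diff with that client could look roughly like this (the key names and variables are made up):

    var store = redis.createClient();

    // snapshots are plain strings; diffs are appended to a list
    store.set('pad:demo:snapshot', currentText);
    store.rpush('pad:demo:diffs', payload);

    // replaying edits later: fetch every stored diff in order
    store.lrange('pad:demo:diffs', 0, -1, function(err, diffs){
      console.log(diffs.length + ' diffs recorded');
    });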

    Try it out!

    I'm still in the process of setting up an online demo of the app. Meanwhile, you can check out the code and try running the app locally.

    Here is the link to GitHub page of the project - http://github.com/laktek/realie

    Please raise your ideas, suggestions and questions in the comments below. Also, let me know if you are interested in contributing to this project (the project is open source).

    ]]>
    Apple is repeating the same mistakes from the past http://laktek.com/2010/05/21/apple-is-repeating-the-mistakes-from-the-past http://laktek.com/2010/05/21/apple-is-repeating-the-mistakes-from-the-past/#comments Thu, 20 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/21/apple-is-repeating-the-mistakes-from-the-past In the 1980s, Apple jumped out to an early lead in personal computers, but then got selfish. Steve Jobs, a notorious control freak, just could not play well with others. Along came Microsoft, with Windows, which was a knockoff of Apple's operating system. Microsoft partnered with everyone and today has 90 percent market share, while Apple's share lingers in the single digits. Today the battlefield is mobile devices, and just as before, Apple jumped out to an early lead. And just as before, Jobs got selfish. He won't support Flash, or any cross-platform tools—because he wants developers locked into his platform, and his App Store, where he collects a 30 percent commission. Daniel Lyons (Newsweek) - http://blog.newsweek.com/blogs/techtonicshifts/archive/2010/05/20/sayonara-iphone-why-i-m-switching-to-android.aspx#

    Clearly, Android is becoming the new Windows (or even better, because it's open source). Just as in the 1980s, when Microsoft knocked off Apple with an OS that would run on any platform, today it appears Google will do the same in the mobile market.

    No matter how beautiful they are, people don't like to stay inside walled gardens. Apple doesn't seem to learn this lesson.

    ]]>
    Implementing Web Socket servers with Node.js http://laktek.com/2010/05/04/implementing-web-socket-servers-with-node-js http://laktek.com/2010/05/04/implementing-web-socket-servers-with-node-js/#comments Mon, 03 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/04/implementing-web-socket-servers-with-node-js Web Sockets are one of the most interesting features included in the HTML5 spec. They open up a whole different paradigm in web application development by allowing asynchronous, long-lived connections between client and server. With Web Sockets now supported in Google Chrome's beta release, it's time to start using them in your apps.

    However, Web Sockets don't go well with traditional synchronous web server environments. Using evented libraries such as Node.js for Web Socket servers seems more practical and scalable. But the initial versions of Node.js didn't have built-in support for Web Socket connections. There were several Web Socket server implementations based on Node.js which overcame this problem by hijacking its HTTP module.

    Node.js still doesn't include Web Sockets in its core modules, as some other languages do (e.g. Go). However, the recent overhaul of the HTTP module has made implementing Web Sockets a whole lot easier. Node.js's HTTP server now emits an "upgrade" event each time a client requests an HTTP upgrade. This event can be trapped when implementing a Web Socket server.

    Here is a simple, low-level example of a Node.js based HTTP server which supports both plain HTTP requests and Web Socket connections.

    var sys = require("sys");
    var net = require("net");
    var http = require("http");

    function createTestServer(){
      return new testServer();
    }

    function testServer(){
      var server = this;
      http.Server.call(server, function(){});

      server.addListener("connection", function(){
        // a new TCP connection has been accepted
      });

      // plain HTTP requests get a simple text response
      server.addListener("request", function(req, res){
        res.writeHead(200, {"Content-Type": "text/plain"});
        res.write("okay");
        res.end();
      });

      // emitted when a client asks to upgrade the connection to a Web Socket
      server.addListener("upgrade", function(req, socket, upgradeHead){
        // complete the handshake by sending back the upgrade headers
        socket.write( "HTTP/1.1 101 Web Socket Protocol Handshake\r\n"
                    + "Upgrade: WebSocket\r\n"
                    + "Connection: Upgrade\r\n"
                    + "WebSocket-Origin: http://localhost:3400\r\n"
                    + "WebSocket-Location: ws://localhost:3400/\r\n"
                    + "\r\n"
                    );

        socket.ondata = function(d, start, end){
          // each frame arrives as 0x00 <utf-8 payload> 0xFF; strip the framing bytes
          var original_data = d.toString('utf8', start, end);
          var data = original_data.split('\ufffd')[0].slice(1);
          if(data == "kill"){
            socket.end();
          } else {
            // echo the message back to the client, wrapped in the same framing
            sys.puts(data);
            socket.write("\u0000", "binary");
            socket.write(data, "utf8");
            socket.write("\uffff", "binary");
          }
        };
      });
    }

    sys.inherits(testServer, http.Server);

    var server = createTestServer();
    server.listen(3400);

    There is a more high-level and elegant Web Socket server library (http://github.com/miksago/node-websocket-server) in development by Micheil Smith. It is built on the new HTTP library and is compatible with draft 76 of the Web Sockets spec (which includes a bunch of security improvements).

    Here is an example of how to implement a Web Socket server with the above-mentioned library.

    var sys = require("sys");
    var ws = require('./vendor/node-websocket-server/lib/ws');

    // print server events in green, using ANSI escape codes
    function log(data){
      sys.log("\033[0;32m"+data+"\033[0m");
    }

    var server = ws.createServer();
    server.listen(3400);

    // normal HTTP requests still get a plain response
    server.addListener("request", function(req, res){
      res.writeHead(200, {"Content-Type": "text/plain"});
      res.write("okay");
      res.end();
    });

    // emitted for every new Web Socket client
    server.addListener("client", function(conn){
      log(conn._id + ": new connection");
      conn.addListener("readyStateChange", function(readyState){
        log("stateChanged: "+readyState);
      });

      // announce the new client to everyone connected
      conn.addListener("open", function(){
        log(conn._id + ": onOpen");
        server.clients.forEach(function(client){
          client.write("New Connection: "+conn._id);
        });
      });

      conn.addListener("close", function(){
        var c = this;
        log(c._id + ": onClose");
        server.clients.forEach(function(client){
          client.write("Connection Closed: "+c._id);
        });
      });

      // broadcast every incoming message to all connected clients
      conn.addListener("message", function(message){
        log(conn._id + ": "+JSON.stringify(message));

        server.clients.forEach(function(client){
          client.write(conn._id + ": "+message);
        });
      });
    });
    ]]>
    I'm with OpenNebula this Summer! http://laktek.com/2010/05/01/im-with-opennebula-this-summer http://laktek.com/2010/05/01/im-with-opennebula-this-summer/#comments Fri, 30 Apr 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/01/im-with-opennebula-this-summer I had the opportunity to be selected for Google Summer of Code in the very first year of my academic life. The experience I gained that summer working with the SilverStripe project boosted my self-confidence and helped me immensely in shaping my career.

    This year, which happens to be my final year as an undergraduate, I'm going to have yet another stint with Summer of Code. This time, it's with OpenNebula. OpenNebula is an open source toolkit for cloud computing. The project is relatively young and small, but it could make a great impact in the future. In simple terms, OpenNebula lets you run your own cloud hosting service like Amazon EC2.

    As a developer who makes use of cloud platforms, I really want to see open standards adopted among cloud service providers. Vendor lock-in is the biggest threat I see when moving to cloud based platforms. Projects such as OpenNebula are great initiatives to avoid this. OpenNebula is also one of the main supporters of the OCCI standard interface. I have been following their development closely from the first day I got to know about them.

    When I saw OpenNebula had been selected as a mentoring organization for the first time in this year's GSOC, I thought this would be a great chance for me to contribute to them. Another interesting thing about OpenNebula is that they use Ruby as their main development language.

    This summer, I will work on building a web based administration console for OpenNebula. I believe it will make OpenNebula more usable and increase its adoption. It would be something similar to the AWS (Amazon Web Services) Management Console, but richer in terms of capabilities.

    Though we are still in the brainstorming phase of the project, I can give a small hint that we will be using Sinatra and other Rack middleware to build it. So it's yet another chance to show the power and flexibility of micro-frameworks. I will be sharing my experiences during the project via this blog. I hope it will be really fun and exciting.

    Finally, a big shout-out goes to my mentor, Jaime Melis, who was very supportive from the time of preparing the proposal. It's always a pleasure to work with someone like him, who is really passionate and knowledgeable.

    If you are interested, you can read my full project proposal here: http://docs.google.com/Doc?docid=0AeUIyatONYiTZGY3ZnYyZmNfNjljanJyZDNjNg&hl=en

    ]]>
    Building Real-time web apps with Rails3 http://laktek.com/2010/02/16/building-real-time-web-apps-with-rails3 http://laktek.com/2010/02/16/building-real-time-web-apps-with-rails3/#comments Mon, 15 Feb 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/02/16/building-real-time-web-apps-with-rails3 When deciding on a web framework to build Realie, one of the main considerations was whether I should move to a totally asynchronous framework. Most established web frameworks, including my favorite Rails, are built in a synchronous manner and follow a call-stack based model. Real-time web apps need to be asynchronous, and the evented programming model suits this ideally.

    Since there was a lot of hype around Node.js based async web frameworks in the last couple of months, my initial idea was to use such a framework for my project. However, that meant a totally new learning curve. Apart from grasping how to use JavaScript on the server side, it also meant adopting a totally new ecosystem for templating, routing, etc.

    However, when I revisited my requirements it was clear that only a part of the web app really needs to be asynchronous. Most parts can still be built with a traditional call-stack based web framework. Using a fully async web framework to build the entire app seemed unnecessary, and it felt like overkill to run two different apps to serve the sync and async parts.

    In this context, I came to know about Cramp, an asynchronous Ruby framework written by Pratik Naik. The best thing about Cramp is the ability to use Rack middleware (keep in mind it's not fully Rack compliant). Then came the idea: how about using Rails and Cramp together to build a hybrid real-time web app? Rails3 makes it easy to mount any other Rack endpoint alongside Rails, so this sounded like the perfect solution to my problem.

    Since Cramp follows the evented model, it needs an evented web server such as Thin or Rainbows!. Further, Cramp has implemented WebSocket support for these two server backends.

    Integrating Rails3 and Cramp

    First of all, you will need to bundle the Cramp gem with your Rails app. For this, open the Gemfile and add the following:

        gem "cramp", :require => 'cramp/controller'

    For my work I only needed the Cramp controller (it also has an async model), so for now that's all I required.

    As I mentioned earlier, to support web sockets Cramp needs to extend the web server we use. To specify the web server, I added an initializer (config/initializers/cramp_server.rb) with the following line:

        Cramp::Controller::Websocket.backend = :thin

    Then, I created a simple Cramp controller, which can respond to web sockets (app/cramps/communications_controller.rb)

        class CommunicationsController < Cramp::Controller::Websocket
            periodic_timer :send_hello_world, :every => 2
            on_data :received_data
    
            def received_data(data)
                if data =~ /stop/
                    render "You stopped the process"
                    finish
                else
                    render "Got your #{data}"
                end
            end
    
            def send_hello_world
                render "Hello from the Server!"
            end
        end

    Now the fun part! The new Rails3 router supports pointing to any Rack-compatible endpoint, so we can easily hook up our Cramp controller for public access. In config/routes.rb add the following:

      match "/communicate", :to => CommunicationsController

    Our Cramp endpoint can co-exist with the rest of the Rails controllers without any issues.
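    For completeness, here is a minimal browser-side sketch of how a client could talk to this endpoint (assuming the app is running locally on port 3000); it is only an illustration, not part of the Realie code.

        // plain browser JavaScript - works in WebSocket-capable browsers such as Chrome
        var socket = new WebSocket("ws://localhost:3000/communicate");

        socket.onopen = function(){
          socket.send("ping");              // the controller replies with "Got your ping"
        };

        socket.onmessage = function(event){
          console.log(event.data);          // also receives "Hello from the Server!" every 2 seconds
        };

        socket.onclose = function(){
          console.log("connection closed"); // happens after sending "stop"
        };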

    Voila!

    Another important change with Rails3 is that it's now a fully compatible Rack app. This means that, like any other Rack app, we can start our Rails app by running rackup.

        rackup -s thin -p 3000 --env production

    This will start our app using the Thin server backend on port 3000. Keep in mind we need to specify an environment other than development to avoid the Rack::Lint middleware; Cramp is not fully compliant with the Rack SPEC, so Lint would throw exceptions.

    ]]>
    Realie Project: Data Structure & Storage http://laktek.com/2010/02/11/realie-project-data-structure-storage http://laktek.com/2010/02/11/realie-project-data-structure-storage/#comments Wed, 10 Feb 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/02/11/realie-project-data-structure-storage Over the last couple of days, I found some time to work on my individual research project for my degree course. The topic area I selected for my project was the "Real-time Web". The real-time web is just the opposite of the way we currently use the web: rather than us checking (polling) content providers for updates, content providers will feed (push) the updates to us. This concept is gaining rapid adoption and I believe it will be the de-facto behaviour of the web within a couple of years.

    The application I decided to build is a code editor with real-time collaboration capabilities. Following hacking traditions, and for ease of remembrance, the project was code-named "Realie" (it was the zillionth time the name was changed and I hope it will stick this time).

    Rather than developing the project behind closed doors and finally presenting a thesis (which would be utterly boring), I thought of building the project in the open, sharing the code and discussing design decisions with the community. I believe what really matters is the experience and knowledge I can gather from this, rather than the final grade I will get for it.

    So let's start by looking at the initial data structure and storage decisions.

    What is Realie?

    Most of you may have heard of, or are already using, Google Wave and Etherpad. Realie was also inspired by those two projects. While Google Wave and Etherpad are known as real-time collaborative editors (or canvases) for the general public, with Realie we try to cater to the niche of hackers and developers. As we know, development is already a collaborative process, involving lots of real-time communication and decision making. As developers we also know this process is far from seamless and often painstaking. This is the void Realie tries to fill.

    Putting it in simple terms, Realie would be like Pastie or Gist, but where multiple people can view and edit at the same time. There will be other jazzy features in the project, but this is the essence of it. Imagine how such a tool could make your remote pair-programming, code-reviewing and brainstorming sessions a breeze.

    Starting from scratch

    As I was starting the project, Google acquired Etherpad, and that move led Etherpad to release their code as open source. Though it sounded like a perfect opportunity for me to fork their code and get my project done, I decided to start from scratch. Etherpad is a mature project and they have already made certain design decisions. Adopting them wouldn't help me gain any experience of the implications of designing a real-time system, or let me explore better ways of doing things.

    One of the first challenges I had to face was deciding on the data storage mechanism to adopt for this project. My initial idea was to create each editing pad as a physical file in the system and track the changes each user makes using Git. This turned out to be unrealistic, as disk IO would be very slow and executing Git commands via the shell would be even slower!

    Relational databases don't seem to be a good fit for the task either, since we will be doing massive writes and continuous querying. The best option in this scenario was a key-value store.

    Choosing Redis

    I was convinced to use Redis as the data store after hearing a lot of good things about it and seeing some impressive benchmarks.

    Redis goes beyond a normal key-value store: we can have lists, sets or sorted sets as data structures, and it's possible to do operations like sorting and taking differences, intersections and unions of the data. Also, one of the most interesting features I saw in Redis was its persistence options. You can use it purely as in-memory storage (which is very fast), write the data to disk periodically, or write every change to disk (append-only file).

    Data Structure

    The atomic data unit of Realie is a Line. Each pad is composed of a bunch of lines, and a line is also equivalent to a single edit a user makes on a file (pad). When storing a line, we need to store the following attributes along with it: user, pad, content, position (line number in the file) and timestamp.

    In Redis, we can only store data values as strings. Because of this, each line is serialized to JSON before being stored in the data store. JSON serialization also makes it easy to consume and manipulate line contents on the client or the server.

    Since we need to keep references to a line in several places in the data store, a unique SHA-1 hash based on the contents of the line is calculated and used as the key for that line.
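    Realie's models are written in Ruby, but the keying scheme itself is language agnostic; here is a rough sketch of the idea in Node.js (the field values are made up for illustration):

        var crypto = require("crypto");

        // hypothetical line attributes
        var line = {
          user: "laktek",
          pad: "pad:42",
          content: "puts 'hello'",
          position: 3,
          timestamp: 1265760000
        };

        // serialize the line to JSON, as it would be stored in Redis
        var json = JSON.stringify(line);

        // the SHA-1 hash of the serialized line becomes the line's key
        var key = crypto.createHash("sha1").update(json).digest("hex");
        // e.g. store it with SET key json, and push the key onto the pad's list of lines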

    As I mentioned earlier, a Pad is a collection of lines. It's basically similar to any source file with lines of code. Beyond that, each pad stores the list of users who are working on it. Users have the option to join or leave a pad as they wish.

    For a pad, there are two basic views. One is the snapshot view, which is the current state of the pad after applying the most recent changes made by the users. The more interesting one is the timeline view, which holds the changes in the order users made them. This view can be used to generate historical versions of the pad (at a given time or checkpoint), or even to create a playback to see how the pad changed over time.

    Source Code

    You can check out the code from the following GitHub repository - http://github.com/laktek/Realie. Please note that currently the project source only contains models and specs for the above data structures, and it may well change as you read this.

    ]]>
    Understanding Election Results through Economic Theory of Democracy http://laktek.com/2010/01/28/understanding-election-results-through-economic-theory-of-democracy http://laktek.com/2010/01/28/understanding-election-results-through-economic-theory-of-democracy/#comments Wed, 27 Jan 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/01/28/understanding-election-results-through-economic-theory-of-democracy Though it's been two days since the announcement of the results, there is still no end to the speculation and rumors about the concluded Presidential Election in Sri Lanka. I don't have any strong political bias towards either of the two main candidates, and I didn't want to accept anything reported blindly.

    This morning, I came across an interesting piece of political science literature written by Anthony Downs, titled "An Economic Theory of Democracy". I could only read the WikiSummary of it, but while going through it my mind kept mapping it onto the context of the Sri Lankan Presidential Election. It helped me dispel some of the doubts in my mind and understand how the majority of people would have voted.

    I thought of jotting my notes down here for others who are interested. Also, I hope someone more knowledgeable on this subject will correct me if my line of thought is wrong (I haven't done any formal study of political science beyond casual reading here and there).

    Downs defines the basic logic of voting as follows:

    In a world of perfect information, each voter would compare his expected utility of having party A (incumbent) in government (for another term, that is) with the expected utility of having party B (opposition) in government. This utility differential would determine each voter's choice at the ballot box.

    In our case, party A would be the current president, Hon. Mahinda Rajapakse, and party B the opposition's common candidate, Gen. Sarath Fonseka.

    Downs further mentions several factors that matter to a typical voter and modify the above model. Let's consider each of those factors.

    1. He doesn't really know what the future holds, so he doesn't know which party's rule will give him greater utility (in the future). So he will instead compare the utility he got over the last term from party A with what he thinks party B would have provided under the same circumstances; if he thinks party B would have brought him more utility, he votes for B.

    I think, since the first Presidential Election, what mattered most to the average Sri Lankan voter was the civil war with the LTTE in the north and east. Hence the end of the war was the biggest utility he got during the last term of party A (i.e. Mahinda Rajapakse). Though Gen. Sarath Fonseka played a major role in the war victory, I feel the other parties involved with him (and their broken promises on ending the war in the past) would have prevented the majority from voting for party B over A.

    2. He doesn't just look at raw utility differential, though; he also considers the trend. Is A getting better or worse? If it is getting better, then the voter will forgive A for early failures to deliver utility.

    This is the point the government (party A) exploited in the two months after the announcement of the election. Fly-over bridges, international stadiums, power plants and all sorts of other development efforts started to blossom within this period. Also, fuel prices and the prices of other essentials were brought down tactfully. Moreover, the government again used peace as a lucrative promise of future development. Voters would have considered this a positive trend. While the opposition clearly showed that the government (party A) had failed to fulfill many of the promises in its last election manifesto (Mahinda Chintanaya), voters seem to have forgiven those failures.

    3. If A and B would have provided equal utility, the voter asks himself whether B would have used identical approaches and policies as A, or different ones? * If B would have been identical, the voter is indifferent and abstains.

    Apart from ending corruption and government wastage, the opposition (party B) didn't have a vastly different approach when it came to solving other problems (improving agriculture, education and healthcare, or creating more job opportunities), which would have failed to convert a subset of voters from their current stance.

    Also, as I see it, the above point could explain the lower voter turnout in the North and East. Neither A nor B seemed to have significantly different or new approaches to solving the prevailing problems of those areas. This would have led many from those areas to abstain from voting.

    * If B would have provided equal utility but by different means, then the voter concludes: "Okay, a vote for B is a vote for something to change, but a vote for A is a vote for no change." He then must evaluate whether change (generally) is a good thing. To make this evaluation, performance evaluations come into play. Based on the history he has seen of various parties governing in various circumstance, he asks himself, "How much utility would the ideal government have delivered me under the circumstances that A has governed in?" If A stacks up well in comparison, he votes against change (i.e. for A). If not, he votes for change (i.e. for B) and hopes for the best.

    Ending corruption and wastage in government was the popular slogan of Gen. Fonseka's campaign (party B). Here again, I believe the voters were split into two camps based on performance evaluations. Some thought Gen. Fonseka, coming to the political arena from a totally different background, would surely bring the change they want by getting rid of all the corrupt politicians. Another group, looking at the past (even considering cases like Hitler, Idi Amin, etc.), feared he would turn into a dictator and decided to vote against the change. Based on the results, it's evident the latter group was the larger one.

    Finally, Downs makes a very important statement in his writing:

    These decisions about utility, however, lack perfect information. He must estimate all these questions about utility based on the "few areas of government activity where the difference between parties is great enough to impress him". In other words, voters use information shortcuts;

    As I understand it, for voters to use information shortcuts they must be able to get a true picture of the context. This is why the existence of free, balanced and impartial media really matters!

    ]]>
    Interesting stuff to watch out in 2010 http://laktek.com/2010/01/02/interesting-stuff-to-watch-out-in-2010 http://laktek.com/2010/01/02/interesting-stuff-to-watch-out-in-2010/#comments Fri, 01 Jan 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/01/02/interesting-stuff-to-watch-out-in-2010 We are already into the 2nd decade of the 21st century, and it is very evident that this will be the decade where cloud computing, the real-time web and the mobile web start to rule!

    According to some buzzword fanatics, this year (2010) will be the transition year from Web 2.0 to Web 3.0. Buzzwords aside, as a web application developer I too expect to see the rise and mainstream adoption of some very interesting technologies during this year.

    Web Sockets

    Remember how AJAX changed the face of the web in the 2000s? As I see it, Web Sockets will be the new AJAX of 2010. They are actually the next step from AJAX in improving the face of the web. Web Sockets allow two-way communication between the browser and the server; with this, HTTP will no longer behave as a stateless protocol. The Web Socket API comes as an upgrade to the HTTP protocol in the HTML5 specification, and a lot of browser vendors and server developers have already shown their interest and started adding support for it. Currently, the Google Chrome beta supports the client-side Web Socket API. In 2010, we can expect other browser vendors, including Mozilla, to support the Web Socket API, enabling web app developers to come up with richer real-time user experiences.

    HTML5

    Apart from Web Sockets, there are a lot of other interesting developments in the HTML5 specification awaiting mainstream adoption this year. Many new browsers have started to support the audio/video elements, which would allow us to finally ditch dirty proprietary plugins (Flash, QuickTime, etc.). Other interesting features in the HTML5 specification include offline data access and the geolocation API, which will be really vital for an improved mobile web user experience. Google already utilises these features in their mobile web apps, which should give a big boost to widespread adoption.

    Rails3

    Exactly one year after the announcement, we are finally getting to see the fruits of the epic merge between Rails and Merb. A much faster, more modular and extensible version of our favourite web framework is almost ready to be released as Rails3 within this year. For more details on the improvements in Rails3, please follow the blog series written by Yehuda Katz on the Engine Yard blog.

    NoSQL Movement (Schema-less Databases)

    Last decade, we only heard about big boys like Google (BigTable) and Amazon (Dynamo) using schema-less key-value data stores. Now, projects like MongoDB, Redis, CouchDB and Tokyo Cabinet are giving the rest of us the opportunity to get a taste of it. Schema-less databases are proving to be far more flexible than traditional relational databases for certain types of projects. The NoSQL movement will surely gain more steam in 2010, so ignore it at your peril!

    Git

    You may wonder: isn't Git already a mainstream technology from the last decade? It's true that it's used to manage the world's largest FOSS project, Linux. But the real power of Git goes beyond a distributed version control system. GitHub, a business model built entirely on Git, is becoming very popular. Certainly, Git has opened up a new dimension in collaborative development and distributed file systems. I believe there are lots of other uses for Git, from a simple CMS to manage your personal blog to distributed data mining of large projects. If you haven't checked out Git yet, I recommend you add it to your todos for this year.

    Node.js (server-side javascript)

    The concept of server-side JavaScript dates back to the 1990s, to the days when Netscape used it as a scripting language in their LiveWire servers. For two decades JavaScript couldn't extend its client-side reign into server-side environments. However, the release of Node.js, an evented I/O framework for the V8 JavaScript engine, has again made JavaScript a strong contender as a server-side development language. Node.js differs from traditional call-stack based frameworks by having a non-blocking API, which is strongly supported by the callback-based and evented nature of JavaScript. If you never cared to understand JavaScript and thought jQuery could save your day, there are now better reasons to dig deeper into the world's most misunderstood language.

    What other fascinating technologies will you be keeping an eye on this year?

    ]]>
    Can an Introvert be a better leader for Sri Lanka? http://laktek.com/2009/12/05/can-an-introvert-be-a-better-leader-for-sri-lanka http://laktek.com/2009/12/05/can-an-introvert-be-a-better-leader-for-sri-lanka/#comments Fri, 04 Dec 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/12/05/can-an-introvert-be-a-better-leader-for-sri-lanka This morning, I came across an interesting article in Forbes titled "Why Introverts Make the Best Leaders". It gives some really good reasons why introverts could lead better than their extroverted counterparts, who are normally considered the natural leaders.

    Being an introvert myself, I know the majority of people have the perception that introverts are just a bunch of obnoxious people whom they don't like to interact with and would never consider as leaders. Surprisingly, as the article points out, some of the world's most successful business moguls, including Bill Gates and Warren Buffett, are introverts. Even in the Sri Lankan business context, there are successful personalities like Dr. Hans Wijayasuriya (CEO of Dialog), who I believe is an introvert.

    Can an introvert rule a country? The article considers Barack Obama to be somewhat of an introvert. How about Sri Lanka? Apart from the late President J.R. Jayawardena, I don't see any of our leaders who even slightly fit the "introvert" label. Maybe an introvert can never gain enough popularity to become the leader in our highly extroverted society.

    But I believe an introvert would be better suited as the leader of our country. Here are some of the reasons that make me believe it.

    1. As the above article points out, introverts think first and talk later. Compare this with our leaders, who spit out whatever bullshit comes to their tongues. Have we ever had a leader who could stand by his words?

      Actually, this is also the main reason why introverts have become unpopular in a talkative, extroverted world. Introverts don't talk much, but when they do, they really know what they are saying.

    2. Introverts draw inspiration and motivation from their work, rather than from social popularity or material wealth. This gives them a better chance of achieving their goals.
    3. When it comes to decision making, introverts are more intuitive. They do not make decisions out of feelings. Naturally, they have a better sense of, and self-belief in, what they can do, and they don't tend to reverse their decisions under external pressure.
    4. They are good listeners. Being a good listener doesn't mean anyone can say anything and influence their decision making. Introverts know how to separate the wheat from the chaff, because they listen with the mind, not the heart.
    5. Introverts have a better ability to understand the capacity of their subordinates. This ensures the right people are used for the right tasks, rather than appointing people based on personal relationships or trust (which has been a huge mistake).

      Also, our country needs to actively involve intellectuals in policy making if we want to achieve sustainable development. Most intellectuals are also natural introverts, and introverts feel more comfortable working with another introvert than with an extrovert.

      ]]>
      First Meetup of LK Ruby User Group http://laktek.com/2009/10/01/first-meetup-of-lk-ruby-user-group http://laktek.com/2009/10/01/first-meetup-of-lk-ruby-user-group/#comments Wed, 30 Sep 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/10/01/first-meetup-of-lk-ruby-user-group Last Wednesday (30th September 2009), the first-ever meatspace gathering of Sri Lankan Ruby users was held at Ridgecrest Asia (Pvt) Ltd. There were more than 20 passionate, enthusiastic Rubyists filling the room, and I would call it a promising start.

      For several years, I personally knew only a handful of Rubyists in the country. Though we shared a love for the language, we doubted whether we could expect wide adoption of Ruby culture in Sri Lanka and ever have an active community going here. One of the main reasons was that, at the time, there was no mainstream industry demand for Ruby. There were only a couple of startups doing Ruby (Rails) based development, and very few developers had the freedom to choose their development toolbox themselves. So someone choosing Ruby as their main language was a rarity.

      However, in the last couple of years things have started to change. Globally, Ruby has seen mainstream adoption, and the success of Rails has made it a de-facto consideration when it comes to web apps. This has made Sri Lankan developers and firms think about Ruby more seriously. We've seen several new and interesting Ruby based projects coming up, and heard of several firms considering migrating their legacy code to Ruby. Overall, these are great signs promising exciting times ahead for aspiring Ruby developers in the country.

      Unlike Java, .NET or other commercial mainstream platforms, Ruby developers are not gauged through professional certifications or training programs. As Matz believed, people should be able to express themselves freely when programming. This is something that cannot be trained or taught; the only way to absorb these Rubyisms is through passion and practice. That's the key difference between a Rubyist and other commercial developers. But the Ruby community believes in collective effort and in helping each other grow.

      The main idea of forming a Sri Lankan Ruby User Group (LK-RUG) was to help developers stay inspired. It is harder to be inspired while working in isolation, especially when you are just starting to grasp things. A gathering like this helps people share what they have learnt from their experiences, while picking up a few tips and tricks from others. Behind many great Rubyists there is a community which helped them grow, and I believe the same could happen in this country too. The very first meeting gave positive signs of that: it was an informal, friendly and very enthusiastic gathering. Let's hope we can maintain the same spirit in future meetings too.

      So, if you already hack with Ruby or are eager to learn about Ruby culture, join the Sri Lankan Ruby User Group and participate in the future meetings.

      BTW, here are the slides from my presentation on "Evolution of Rails", which I gave at the first meeting.

      P.S. Special thanks to Sameera Gayan for coordinating the event, and to Ridgecrest Asia (Pvt) Ltd. for offering the location for the meeting.

      Update (05/10/2009):

      Gaveen's thoughts on the meetup

      Flickr Photoset of the meetup (uploaded by Gaveen)

      ]]>
      Independent Thinking http://laktek.com/2009/08/08/independent-thinking http://laktek.com/2009/08/08/independent-thinking/#comments Fri, 07 Aug 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/08/08/independent-thinking Independent thinking and conscious decision making are what build a person and a society. Yet they are the most discouraged, criticised and often punished acts a person can engage in. Our culture has done a nice job of misinterpreting and abusing values such as obedience, loyalty and teamwork to suppress the importance of independent thinking.

      This starts happening from our birth, where parents try to give extra protection and care all the time. They will not take their eyes off the child, and will not allow the child to touch anything or play as he desires. This may be purely unintentional, due to their excess love for their child. However, when they unconsciously continue this beyond the limits, they actually harm their child by blocking his creative sense and his opportunities for self-realization. After all, humans are not as weak as we seem.

      As I discussed in the previous post, schools then exert enough pressure to kill off whatever independent thinking capability is left within a person. This will continue to happen until the exam-oriented education structure vanishes and people realise the value of each other irrespective of educational or social background. Sir Ken Robinson presents this point nicely in his TED Talk. Take some time to watch it, if you haven't seen it before.

      Things get worse when you enter higher education, where you would expect independent and critical thinking to be fostered. You are guaranteed poor grades if you challenge or try to explore beyond what is taught. Parrotised lecture notes should be vomited onto the paper if you want higher grades (whether it's lack of knowledge or envy is still a puzzle). With the belief that the higher the GPA, the higher your salary, nobody seems bothered to question the current knowledge system. These professionals are averse to change and would never encourage their subordinates to change. This results in a legacy knowledge system that is incapable of solving today's problems.

      When it comes to politics, corporate business or any other form of community activity, you see the obvious. There is very little room (or actually no room) for independent thinkers. You are assured of being sidelined, mocked, harassed and, in the worst case, even paying the penalty with your life if you hold a point of view different from the so-called majority (which is actually a minority that has exploited power and force to grab the blind following of the rest, who have been trained by the earlier systems not to use their wit).

      Just think about it independently ;)

      ]]>
      Ban Schools & Education! http://laktek.com/2009/07/28/ban-schools-education http://laktek.com/2009/07/28/ban-schools-education/#comments Mon, 27 Jul 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/07/28/ban-schools-education It's very sad and alarming to hear of the recent incidents taking place at Sri Lankan schools. The government shows it is so concerned about these issues by banning everything it believes can harm our next generation :) Mobile phones are banned at all schools, websites with explicit content are banned, and screenings of "Adults only" movies are banned. OK, now the government can say it has taken all the necessary steps to groom our next generation into well-disciplined citizens, and the future of our country is guaranteed to be prosperous.

      However, the government and its so-called advisers will never realize the root causes of all these problems. Their short-sighted decisions and floating policies from the past have aggravated these problems to this level, and none of their decisions will help change the situation in the long run.

      I believe the Sri Lankan education system is screwed up big time! Kids are thrown into a rat race from kindergarten, when they don't have even the slightest clue where they are heading. Not to mention that, even after going through all the steps of primary, secondary and tertiary education, more than 80% of them still don't have an idea why they ran all these years. It leaves a big question: do we have to run at all?

      Aside from the spoon-fed knowledge, selfishness, insensitivity, jealousy and a hunched back (after carrying a 4kg school bag) are the only gains from the current education system. Why has the Sri Lankan education system failed so miserably at building citizens with the self-confidence that they are adding value to society? Why can't someone be a janitor, carpenter, factory worker, farmer, dancer, sportsman or a doctor and still feel they are all equal in society?

      This false social grading starts with primary school admissions. Only the kids of the rich and the so-called elites are admitted to the popular schools. No matter how close you live to the school, your child will not be admitted if you cannot afford a hefty donation to the school's development fund, or if you don't have enough civil power and political influence. From year 1 these kids start associating only with a certain social layer and never understand there is another way of life above or below them. They will measure the quality of their lives relative to these layers. Basically, the layers above them are the superior and powerful, and the layers below them are the inferior and wretched. They will never understand that all these social layers have their own mix of good and bad.

      The next biggest mistake is the misinterpretation of aesthetics and extra-curricular activities in schools. You are not allowed to sing or dance unless you want to take part in Derana Little Star. You don't get to play cricket unless you can make the college XI. Talking about myself, I had no skill in any sport or aesthetic pursuit. Still, I went to football practices, knowing bloody well that I would never be selected for the college team. I took part in drama, dancing and singing practices for the cultural day every year, though I only got the chance to be on stage a handful of times. Later I learnt it wasn't my talent. But looking back today, the experience and lessons learnt through those activities are impossible to gain by just sitting in a classroom. The neglect of extra-curricular activities in schools is also a main cause of the unfortunate incidents we hear about today. I know of some schools cancelling sports meets and cultural days to finish the syllabuses on time. Can we call such places schools?

      There is more running through my mind, but I will stop this rant here. What I want to stress is that whether you are a government official, principal, teacher, parent, sibling or even a total outsider - please pay attention to the root causes and be aware of what's really happening at schools. I'm sure none of you wants to hear of more unfortunate incidents.

      ]]>
      Six virtues of being an IT undergrad... http://laktek.com/2009/07/08/six-virtues-of-it-undergrad http://laktek.com/2009/07/08/six-virtues-of-it-undergrad/#comments Tue, 07 Jul 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/07/08/six-virtues-of-it-undergrad If you were one of the early readers of my blog, you would know that I was selected to do my Bachelors in IT at the Faculty of Information Technology, University of Moratuwa. It's hard to believe 3 years have gone past in a flash, but reminiscing about what I gathered during this period feels really awesome.

      From a young age, I had a passion for playing with computers and the internet. In those days I used to daydream about building all sorts of awesome products, and believe me, I still have some of the pen sketches of those ideas. By the time of my Ordinary Levels I had already got the opportunity to do some work with local web development companies (especially with the awesome folks at E-Fusion, the people who did kaputa.com - which became the trendsetter of Sri Lankan portals). So my initial idea was to say goodbye to formal education after O/Ls and get into full-time IT work. I even had some friends at school who were ready to work with me on a startup. However, my mentor at that time, Mr. Niranjan Meegamana of E-Fusion, convinced me that I should continue with my secondary education and pursue a degree in IT if I really wanted a long journey in the industry. That motivation led me to end up selecting Sri Lanka's only national degree course for IT.

      Initially I had a few doubts about how things would turn out, mainly due to all the crappy stuff I had heard and seen about local universities. However, things turned out to be really peaceful within the IT faculty and the University of Moratuwa. There were zero interruptions to the course, and third-party influences were minimal. Honestly, I believe getting into the IT faculty was one of the best things that happened in my life, and it opened up a lot of opportunities for me to reach my ultimate goal.

      I thought of sharing some of those experiences and highlights, hoping they would help inspire someone else.

      1. It's Free

      Thanks to the free education structure in Sri Lanka, I'm privileged to do my degree course for free, which would have cost more than USD 10-15K if I had done it at a university in another country or at a private institute. Coming from an average middle-class family, I'm really happy that I could continue my higher education without being a burden to my parents.

      2. Converted to FOSS

      When I stepped into the IT faculty, I was not a hardcore advocate of FOSS. But within a few weeks, due to the influence of FOSS advocates at the faculty - like Prabhath, Mifan, Anushke and Amila - I too converted into a FOSS purist. Since then I haven't looked back, and today free software culture has become an integral part of my life. The joys and benefits reaped from free software culture deserve a separate post, so I will share more on that in the future.

      3. Internship at a startup as a freshman

      As Paul Graham says, "The way to learn about startups is by watching them in action, preferably by working at one". I got this opportunity from my freshman year. Prabhath, who happened to be my mentor and role model at the university, invited me to join them at Vesess. The inspiration and motivation I gathered just by watching how they work was immense. The experience you gain from the challenges at a startup cannot be matched by any other learning experience. Not only do you get to solve problems that matter in real life, you also see how people use what you build. Our product for online billing - CurdBee - has today become one of the most essential apps for freelancers.

      4. Google Summer of Code

      Google Summer of Code is a program hosted by Google to encourage university students around the globe to contribute to open source software. There is strong interest in this program at the University of Moratuwa (in 2008, it happened to be the top university in the world by number of accepted GSOC students). I had the opportunity to take part in GSOC in 2007, where I worked on the SilverStripe CMS framework. I successfully completed building a Mashups module for SilverStripe and released my work for public use.

      5. Opportunities to network

      I believe the most important stuff you learn at university is learnt outside the lecture room. I had the chance to listen and talk to a lot of amazing people from different walks of life. Some of them were visiting lecturers, seniors or my own batchmates, and some were just random dudes who got caught up in a little chat at the canteen over a cup of tea. No matter who they were, listening to them and sharing thoughts with them helped me expand my perspectives and change my attitude towards life. During our usual after-lunch banter we go through an unimaginable number of topics: music, cricket, geekery, hacking, farming, the environment, the oil crisis, politics, women, sex and religious philosophies. It's truly amazing how much knowledge and experience can be shared when such a highly diversified group of people gets together.

      6. Lurking on the internet gives you an edge

      In IT you rarely have to parrot long formulas or boring theories. You need not burn yourself out doing field research, nor waste your time on boring practicals. No need to run after seniors for 'kuppis' during exam time. All you have to know is how to use Google and Wikipedia to get through the academic stuff. I have sat for exams without a single note, having only read Wikipedia. If you have a little itch to read more on a subject and keep up with the latest trends, you will have an edge over others. I don't think there could be a better academic course than IT for an internet addict like me.

      It's not the qualification you gain from an academic degree that matters; the exposure, opportunities and experience you gain during the journey are what will shape your future.

      ]]>
      Sticking to Basics http://laktek.com/2009/07/07/sticking-to-basics http://laktek.com/2009/07/07/sticking-to-basics/#comments Mon, 06 Jul 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/07/07/sticking-to-basics I love Test cricket. I believe it closely resembles our real lives, and it gives so much inspiration. Sri Lanka's remarkable victory today against Pakistan, after being at the jaws of defeat, is yet another classic example of that. Pakistan dominated the first 3 days of the game, and by the end of yesterday it was almost certain they had sealed the victory. Yet, within just one unfocused, carefree session of play, they let go of all the good they had done in the past 3 days.

      On the other hand, the Sri Lankans went out to field today with a glimmer of hope of a victory. They needed to bowl out the Pakistanis before they reached a mere target of 97 runs. Given Pakistan's strong batting line-up it seemed a daunting task, and it seemed only a miracle could reverse the result. But the Sri Lankans did win comfortably in the end, without any miracles or magic. Sri Lanka's match-winning bowlers - Murali, Vaas and Malinga - were not even in the playing XI. Ajantha Mendis, the only trump card for Sri Lanka, had only an ordinary game. So what changed the game? It was the hard work of 3 average bowlers - namely Herath, Thushara and Kulasekara. They bowled with discipline, and skipper Sangakkara kept his trust in them and rightfully exploited the opportunities.

      Pakistan skipper Younis Khan, very correctly explained what cost them the game - "When I am under pressure, I go back to my basics. They need to go back to basics too. Break it down into small-small sessions, be it batting, be it bowling, be it fielding. It's only a six-hour day, it shouldn't be that difficult."

      Isn't this what happens in our lives too? When things are going fine for us, we tend to neglect our basics. And when things go wrong, we experiment and seek all sorts of other fixes - forgetting to return to our basics. Today's game taught us the value of sticking to our basics. That's the secret mantra for success!

      ]]>
      [Rails Tips] Reduce Queries in ActiveRecord with :group http://laktek.com/2009/06/13/rails-tips-active-record-querying-be-smart-and-vigilent http://laktek.com/2009/06/13/rails-tips-active-record-querying-be-smart-and-vigilent/#comments Fri, 12 Jun 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/06/13/rails-tips-active-record-querying-be-smart-and-vigilent I thought of sharing some tips on Ruby on Rails development which will come in handy, especially if you are a newbie. The cornerstone of all of Rails' magic is ActiveRecord. As you know it's an ORM, which hides all the cumbersome and mundane SQL behind a sugar-coated syntax. However, blind and lazy usage of ActiveRecord can really hurt your application's performance. I found this particular instance when revisiting the code of an app I had written in my early days of Rails. As a newbie overdosed with ActiveRecord's magic, I had written a blunt piece of code which looks horrible and also makes the app painfully slow.

      1550 items in total (1350 Available, 150 Out of Stock and 50 Discontinued)
      This was the expected summary output. On the surface, displaying such a block seems trivial, right? I had the following in the view:

      <%= @all_items_count %> items in total (<%= @available_items_count %> Available, <%= @out_of_stock_items_count %> Out of Stock and <%= @discontinued_items_count %> Discontinued)

      Then in the controller, I had explicitly assigned ActiveRecord query results to all four variables. (The 'acts_as_state_machine' plugin provides the count_in_state method.)

      def index
         @all_items_count = Item.count :all
         @available_items_count = Item.count_in_state :available, :all
         @out_of_stock_items_count = Item.count_in_state :out_of_stock, :all
         @discontinued_items_count = Item.count_in_state :discontinued, :all
      end

      Technically this works. But can you spot the issue here? Let's go to the terminal and inspect the app's logs. When rendering this action, it sends a separate query to the database to get each of the four values. Database queries are costly and will cause a slow response.

      Better Solution?

      This action would be far more efficient if we could reduce the number of database queries. So is there a way to do that? Remember that we can group the results of an SQL query? You can specify the :group option in ActiveRecord's query methods. I modified the previous code to pass the :group option to the query:

      def index
         @items_in_state = Item.count :all, :group => "state"
      end

      Now, instead of four queries, we are making only one database query. With the :group option passed, ActiveRecord returns the results as a Hash, so we can grab the count of items in each state. Let's change our view to adapt to the changes we made in the controller.

      <%= count_items("all") %>; items in total (<%= count_items("available") %> Available, <%= count_items("out_of_stock") %> Out of Stock and <%= count_items("discontinued") %> Discontinued)

      Here, I used a simple helper method called count_items to make it more elegant. Here is what goes in the helper:

      def count_items(state)
        # "all" sums the per-state counts; any other state is looked up in the hash
        return @items_in_state.inject(0) { |count, s| count + s[1] } if state == "all"
        @items_in_state.fetch(state, 0)
      end

      To return the total count, we can use Ruby's inject method (http://www.ruby-doc.org/core/classes/Enumerable.html#M003171), which iterates through the hash to take the sum. Also, keeping the basics in mind, we should index database fields that are queried regularly; in this case, it is better to add an index on the state column of the table. Mistakes like this are very obvious and can easily be avoided if you do things with some sense. However, know the trade-offs and always keep an eye on what's happening backstage. Don't let the faithful genie turn into a beast.

      ]]>
      Ruby Best Practices http://laktek.com/2009/04/14/ruby-best-practices http://laktek.com/2009/04/14/ruby-best-practices/#comments Mon, 13 Apr 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/04/14/ruby-best-practices First of all, sorry for letting this space go on hiatus yet again. Though I tried to make a habit of posting regularly, other priorities didn't allow me to do it as I wished. In the last few months I have run through a lot of challenges, in real life and in hacking, which I feel would be worth sharing. I promise I will start posting about them soon.

      Meanwhile, I'm glad to inform you that I will also be contributing to the Ruby Best Practices blog, a collaborative effort organized by Gregory Brown of Prawn fame. The rest of the RBP core team includes well-experienced and interesting developers such as James Britt, Kirk Haines, Robert Klemme, Jeremy McAnally, Sean O'Halpin and Magnus Holm. So if you are passionate about writing smart and robust Ruby code, you will really enjoy this blog.

      Visit Ruby Best Practices blog.

      ]]>
      Get mocked! http://laktek.com/2009/02/25/get-mocked http://laktek.com/2009/02/25/get-mocked/#comments Tue, 24 Feb 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/02/25/get-mocked My first few days doing Maths at Advanced Level were a nightmare. I had always sucked at Maths in high school, and only my strong passion for IT made me take Maths (that's the only route in Sri Lanka to higher education in IT). I couldn't grasp shit, other than recognizing some Greek characters on the board. The tutor was ruthless, and sarcastic at his best. To make matters worse, the class was full of the opposite sex, waiting to LOL at any insult thrown. I just sounded like a total dumbass!

      The tutor said I would not get beyond a simple pass even if I worked my ass off. However, after two years of hard work, I proved him wrong by entering the University of Moratuwa to pursue my childhood dream of a career in IT. It was the mockery in that class that made me strong and motivated me to bring out the best in me.

      It's natural to feel humiliated and give up when you get mocked by others, but try to turn the mockery to your own advantage. Don't try to avert it or defend against it. Let them mock you! Just keep believing in yourself and stick to what you do!

      ]]>
Professional Education is Bullshit! http://laktek.com/2009/01/16/professional-education-is-bullshit http://laktek.com/2009/01/16/professional-education-is-bullshit/#comments Thu, 15 Jan 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/01/16/professional-education-is-bullshit Most of my university colleagues have this craze for following various professional education courses and certifications. They call it CIMA, BIT, BCS, ACS, SCJP, CCNA, MSDN, PMP and the list goes on. I don't get the rationale behind this. What's the benefit of having all these qualifications? What exactly do you gain by spending such hefty amounts of money on these courses? Is it because you think you can decorate your CV with all this bullshit? Or is it just for the sheer pleasure of seeing random Latin characters printed after your name?

As far as I know, most of these courses focus on a single line of technology or a certain set of standards, with no guarantee of being relevant in another two years' time. Also, compared with academic education, these courses don't offer much diversity or depth either. You could easily get through the exams by parroting the mock question bank and puking it all out at the exam. There is very little chance of anything being retained or absorbed.

The reality is that those qualifications or grades themselves won't make you brilliant. They will just take you far away from reality. They may signal to you and to the world that you are a qualified professional, but in reality most of these people struggle to get things done and fail miserably at the real targets. Bill Gates, Steve Jobs, Sergey Brin and Larry Page all have one thing in common: all of them dropped out of their academic careers to reach their destiny. That doesn't mean you need to drop out to make a difference. However, what is evident is that the stuff you do, the challenges you meet, the problems you solve and the experiences you gain during your academic career are what make you different from the rest of the stack.

Inspire yourself to gain some real-life opportunities. Organize a gig, do some research (I mean real research, not that tomfoolery), start contributing to an open source project or create your own startup. Find something which matches your passions and engage with it. Don't waste your precious time and money blindly running after professional qualifications that aren't worth a shite.

      ]]>
Belated Resolutions http://laktek.com/2009/01/11/belated-resolutions http://laktek.com/2009/01/11/belated-resolutions/#comments Sat, 10 Jan 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/01/11/belated-resolutions It is almost mid-January and this is my first post for the new year. I'm looking at 2009 as a switching year in my life. The days of adolescence have almost come to an end and real-life responsibilities seem to be creeping up. The days for bullshitting are numbered, and "I'm still learning" and "I'm still trying" will no longer be accepted as excuses. So it's high time to get real and face the music.

      Here is what I will be looking to do this year (in no particular order)

      • Fix my sleeping pattern (and cut it short to 5 hours)
      • Have a regular workout routine.
      • Turn CurdBee into a solid web platform.
      • Keep on Blogging
      • Launch hackruby.com
      • Toastmastering for real (and be a competent communicator)
      • Horticulture
      • More Driving (only if I could fill my gas tank)
      • Avoid becoming an RSS/Twitter junkie
• AND finish off this Degree, which sucks big time!
      ]]>
Ruby Advent Calendar http://laktek.com/2008/11/30/ruby-advent-calendar http://laktek.com/2008/11/30/ruby-advent-calendar/#comments Sat, 29 Nov 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/11/30/ruby-advent-calendar Inspired by the previous incarnations of Ruby advent calendars, I thought of running one for this year. For those who are unaware, the idea is simple: serve an interesting article on Ruby each day for the first 24 days of December.

We saw a lot of cool stuff in Ruby throughout the year, so it would be nice to reminisce on it and learn some new tricks for the year ahead.

      Like to Contribute?

I would really love to have your contributions. You can share almost anything related to Ruby culture. Please feel free to let me know (lakshan at web2media dot net) if you would like to contribute. Let's make this fun and knowledgeable.

      Bookmark and Follow

The first post will go up tomorrow (Monday) morning. Be sure to bookmark it - http://advent2008.hackruby.com

You can follow the daily updates via Twitter - rubyadvent


      ]]>
I want to be a Ruby Hacker... http://laktek.com/2008/11/21/i-want-to-be-a-ruby-hacker http://laktek.com/2008/11/21/i-want-to-be-a-ruby-hacker/#comments Thu, 20 Nov 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/11/21/i-want-to-be-a-ruby-hacker In the past couple of months, I have heard this from a number of my friends. I hope more will be joining the club in the coming days. For the benefit of the freshers, I thought of sharing some tips I learned about Ruby hacking (though they may sound obvious to many).

1. Write something that scratches your itch - Rather than blindly following some tutorial someone has written, try to solve one of the problems you have (such as a simple todo list) and try to grasp the concepts during the process.

2. Learn to read source code - Reading code is one of the best ways to learn how to code from great programmers. Especially in Ruby, the syntax is very easy to comprehend, so you can read code as if you're reading a fairytale. If you are looking for a good book to begin with Ruby idioms, I recommend Why's Poignant Guide to Ruby, which inspired me to learn Ruby.

3. Pragmatic Programming approach and Agile development is the way to go - If you are still accustomed to writing an SRS and drawing UML diagrams before you begin to code, then you will not feel comfortable with the concepts of Ruby. Find and read The Pragmatic Programmer by Andy Hunt and Dave Thomas; it will help you refresh for a good start.

4. Use a text editor - If you are coming from .Net or Java environments, you may be obsessed with using an IDE. But adopting a lightweight text editor such as Emacs, Vim, TextMate (Mac only) or Gedit is bliss. Because Ruby is all about crafting code, not dumping auto-generated piles of shit.

5. Learn Git - The Ruby community loves Git for source code management. Many Ruby frameworks, gems and plugins use Git as their SCM. Besides, Git can make your development workflow more flexible, productive and reliable. You may like to bookmark GitHub. Don't complain to me if, in 6 months, you are browsing it more than Facebook.

6. Use a *nix Operating System - Ruby and related tools play really nicely on Linux, Mac and other *nix based operating systems. That doesn't mean Ruby isn't supported on Windows; indeed it is. But if you are looking forward to a deep dive into Ruby, you will feel more comfortable in a *nix based environment.

7. Don't live under a rock - The Ruby community is very fast paced. There are a lot of new plugins, gems and frameworks coming up every week. Also, a lot of best practices and useful tutorials are being blogged. So it is better to follow some blogs and podcasts to keep yourself updated. Personally, I recommend RubyInside, RubyFlow and the RailsEnvy podcast to catch the best.

      8. Bonus Tip: Learn to be "Passive Aggressive" - You will need to understand passive aggressive behavior and practice it for your defence. Even doctors say it's good for your health :P

      ]]>
If Rails is a Ghetto, Merb is a Whorehouse http://laktek.com/2008/11/20/if-rails-is-a-ghetto-merb-is-a-whorehouse http://laktek.com/2008/11/20/if-rails-is-a-ghetto-merb-is-a-whorehouse/#comments Wed, 19 Nov 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/11/20/if-rails-is-a-ghetto-merb-is-a-whorehouse Don't get confused by the title; I'm not trying to punch the Merb community as Zed did with Rails. A grasshopper like me doesn't even qualify to make such a rant. I'm actually trying to pimp Merb!

Merb is the newest addition to Ruby town. It was started just to satisfy the unfulfilled desires of some homies who lived in the Rails ghetto. Soon they found Merb to be a sweet spot where they could refresh and relieve the pressures they had with Rails. The hush-hush about Merb spread fast, and it had its doors open for everyone from day one. However, apart from the hustlers, many others backed off, mainly due to the spread of FUD. In the midst of all the hate-games, Merb has turned 1.0 and even Matz, the Godfather of Ruby Town, has given his thumbs up to Merb. It is no longer a dark alley and it's here to stay.

      It's your Call

Not all want to bang with big Racks; your taste may be for micros. You may want to roll with a mature one like ActiveRecord, or maybe you'd love to do a tender Sequel. How about relaxing on a Couch? Nothing to be embarrassed about, Merb knows how to satisfy you all alike. Just do it in your style; in Merb you could even cum_later run_later. Did I tell you that you can bring your own toys (meh, slices) to Merb?

Could I be exposed to STDs?

Many fear that Merb will bring STDs (Stupid Terrible Dependencies) to their apps and systems. In fact, this was a PITA in the early days of Merb, but the power of Thor and the new bundling strategy will help you take better care of yourself.

      If you yearn for some real fun and action, now it's time to head over to Merbhouse!

      ]]>
Extended-Bort: My base Rails app http://laktek.com/2008/10/31/extended-bort-my-base-rails-app http://laktek.com/2008/10/31/extended-bort-my-base-rails-app/#comments Thu, 30 Oct 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/10/31/extended-bort-my-base-rails-app Bort is an awesome base Rails app, which allows you to get into real action without wasting your time setting up the most common and boring stuff. It comes with RESTful Authentication, OpenID support, Capistrano multi-stage deployments and many other essential plugins, thus lifting a good bit of the workload. I first got to use Bort when developing MyConf for Rails Rumble, where agility mattered the most. Since then, I felt it would be ideal to use Bort as the cookie cutter for my future Rails apps as well. However, I felt several changes were needed to make it more ideal for my workflow. Hence, I forked Bort and came up with Extended-Bort!

      What are the changes?

      • Git Submodules are used to keep Rails and other plugins updated.
• Included Rails 2.2.0 with the app
      • Added annotate-models and make_resourceful plugins
      • Added Action Mailer initializer and SMTP settings for production mode
      • Uses admin email specified in settings.yml in exception notifier
• Replaced the RSpec story runner with the new Cucumber scenario framework (webrat and cucumber plugins are included)
• Replaced Prototype.js with jQuery
      • Replaced asset_packager with bundle_fu for bundling assets
      • Changed Stylesheets by adding an initial stylesheet, application stylesheet and Hartija CSS print stylesheet

      Want to Use?

      If you feel like using Extended-Bort, follow these steps:

  git clone git://github.com/laktek/extended-bort.git
  git submodule init
  git submodule update

      Edit the database.yml and the settings.yml files

  rake db:migrate

Change the session key in config/environment.rb and REST_AUTH_SITE_KEY in the environment config (you can generate keys using rake secret).
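If it helps, the session setting in config/environment.rb of a Rails 2.2 app looks roughly like the sketch below; the key name and secret here are placeholders, not values shipped with Extended-Bort:

  # inside the Rails::Initializer.run do |config| ... end block
  config.action_controller.session = {
    :session_key => '_myapp_session',                       # rename to suit your app
    :secret      => 'paste-the-output-of-rake-secret-here'  # must be a long random string
  }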

Have a brew and celebrate (that step comes from the original Bort guys, but you can still do it ;) )

      ]]>
Really Simple Color Picker in jQuery http://laktek.com/2008/10/27/really-simple-color-picker-in-jquery http://laktek.com/2008/10/27/really-simple-color-picker-in-jquery/#comments Sun, 26 Oct 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/10/27/really-simple-color-picker-in-jquery Recently, I needed a color picker with a predefined color palette for my work. Thanks to many enthusiastic developers, several popular, sophisticated color pickers already exist for jQuery. However, most of these plugins look complex, as if they were made to be used in an online image editor. They are overwhelming for simple usage and less flexible for customization. So I had to write my own simple color picker from scratch.

Usage of the color picker is very straightforward. Users can either pick a color from the predefined color palette or enter a hexadecimal value for a custom color. Compared to other plugins, it is very lightweight (only 5KB uncompressed) and unobtrusive to use. It doesn't require any dependencies apart from jQuery core and uses simple HTML/CSS for presentation. You can easily customize the default color palette by adding more colors or replacing the palette with a completely different set of colors.

      Want to try?

If you want to see it in action before trying it out yourself, here is a simple demo of the plugin.

      Download Color Picker via GitHub

      Usage

Color Picker requires jQuery 1.2.6 or higher, so make sure to load it before Color Picker (there are no other dependencies!). For the default styles of the color picker, load the CSS file that comes with the plugin.

      <script src="jquery.min.js" type="text/javascript"></script>
      <script src="jquery.colorPicker.js" type="text/javascript"></script>

      Add a text field to take the color input.

      <div><label for="color1">Color 1</label>
      <input id="color1" name="color1" type="text" value="#333399" /></div>

Then call the 'colorPicker' method on the text field when the document loads.

       jQuery(document).ready(function($) {
          $('#color1').colorPicker();
        });

Are your favorite colors missing? Just add them to the palette:

        //use this method to add new colors to palette
        $.fn.colorPicker.addColors(['000', '000', 'fff', 'fff']);

      Or completely change the color palette as you need...

        $.fn.colorPicker.defaults.colors = ['000', '000', 'fff', 'fff'];

      That's all you have to do!

      Future Improvements

      This is only the initial release of Color Picker. There may be wild browser bugs or you may find smarter ways to improve the functionality of the plugin. I'm open to all your suggestions and complaints. Leave a comment here or contact me directly.

      Further, the code of the plugin is available via GitHub, so if you feel like forking it and playing with it please do!

      Update

The plugin now supports all major browsers including IE6! (Thanks to muser for the patch)

      Update #2 (October 14, 2009)

The color picker will automatically update its color when the value of the input field is changed externally. (Thanks to John for initially identifying the issue and to Sam Bessey for pushing me on this change :) )

      Update #3 (February 17, 2012)

Made a significant change to support transparency and other additional options. Special thanks for the contributions from Daniel Lacy. Please refer to the README for more details.

      ]]>
Love what you do http://laktek.com/2008/10/15/love-what-you-do http://laktek.com/2008/10/15/love-what-you-do/#comments Tue, 14 Oct 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/10/15/love-what-you-do In "Maybe you can't make money doing what you love", Seth Godin challenges the conventional wisdom about careers. He argues that sometimes converting your passion into a profession may not work. Though most of us may not be willing to accept it, this is the harsh reality of life. Rather than doing what you love, you have to begin loving what you do.

Whether you are employed in a large organization or running your own startup, you cannot expect to work only on what you love. If you are in a larger organization you may have to bow to your boss and do whatever he orders if you need to secure your paycheck. Though you may have creative and innovative ideas, getting past organizational restrictions and policies may not be easy. You may think that if you were running a startup you would have the freedom to follow your heart's desires. However, in reality it's not as sweet as it sounds. If you want your startup to make profits and have a steady cash flow, you always have to operate with scarce resources. So you will need to go beyond what you love to do.

Imagine a bunch of kickass coders working towards creating the next killer web app. If the app's interfaces are confusing, the servers are clunky, support is poor and the business model is vague, it will be in the deadpool within two weeks. In reality the code is only 10% of the whole mission. In a startup, you need to handle a highly varied set of tasks yourself. You cannot avoid anything by saying that's not my cup of tea.

You cannot expect the world to spin the way you want. You have to embrace whatever comes to you and turn it to your own good. If you wait until the perfect time comes, it will be too late. The secret mantra of many successful people is their multifaceted character. Were they born gifted with these skills? In my belief, they developed these skills by loving what they had to do.

No matter how much passion you have, if you don't know how to market, how to face challenges and how to create opportunities, there's very little chance you can really benefit from it.

      ]]>
Startups and Real Life http://laktek.com/2008/09/28/startups-and-real-life http://laktek.com/2008/09/28/startups-and-real-life/#comments Sat, 27 Sep 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/09/28/startups-and-real-life Yesterday I participated in the first Academic Symposium organized by our faculty. The topic for the day was Graduation, Entrepreneurship and Success. The panel, which was moderated by Peter D'Almeida (he is also an old Benedictine), included Dr. Sanjeeva Weerawarna, Wegapitiya (Laughs), Harsha Purasinghe (Microimage), Ramesh Shamuganathan (JKCS) and Mohammed Azmeez (Concept Nursery). The discussion raised some insightful and interesting thoughts on passion, ideas, startups and funding.

However, like many other startup-related discussions, it paid very little attention to the social factors involved in running a startup. In my view, that is the most challenging aspect, more so than how to build the product or how to raise funds.

They stressed the point that you should only pay attention to your passion and work continuously to achieve success. Basically, it should be your top priority. What does this mean? You have to leave behind your family, spouse and friends to drive your goals. One day you may be on the Fortune 500 list, but can you be satisfied with your life after neglecting your near and dear ones? Can such a way of life be happy and responsible?

From my little experience, what I see is that if you are to solve real-world problems you have to live in the real world. This is a point where most tech companies have gone wrong. For them the real world sucks (as in the Jerry Seinfeld and Bill Gates commercial), but the truth is you cannot understand it only through logic, theories or research. If you are to understand the real world, you have to live in it.

If you become detached from the real world and isolate yourself in work, you are producing imaginary products. You are forced to believe the world would embrace any stupid idea you throw at it. Tech blogs may call your idea a Paradigm Shift and VCs may be ready to invest lucrative amounts of money. But unintentionally you're taking the world away from reality. You should not be surprised if someone thinks he is safe from peak oil and the food crisis because he has enough oil wells and fields in Second Life. However, the reality is that there are more basic problems in this world which have never even caught the attention of this so-called web 2.0 space. To find these problems you need not go to the other corner of the world. These needs are already within our everyday life.

So don't let startup fantasies ruin your real life.

      ]]>
Simple command line todo list http://laktek.com/2008/08/28/simple-command-line-todo-list-manager http://laktek.com/2008/08/28/simple-command-line-todo-list-manager/#comments Wed, 27 Aug 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/08/28/simple-command-line-todo-list-manager They say pen and paper is the best way to manage a todo list. Following the popular norm, I also started tracking my todos with pen and paper. But after several unsuccessful attempts at finding a pen or deciphering tasks from torn or soaked paper, I felt the keyboard and the pixel screen would be more accessible than the traditional method. Without relying on any sophisticated applications, I went with the simplest form - plain text. It was better, but after being inspired by efforts like todo.sh and todo.pl, I thought of coming up with a small command line utility of my own.

      Enter todo gem!

This resulted in coding my first Ruby gem - todo. It is just a simple command line utility for managing todos. I didn't want to lose the flexibility offered by plain-text lists, hence it uses a human-readable YAML format to store the lists. This lets you use your favorite text editor to edit the todo lists. Further, it supports tags: you can list tasks by tag, making things smart and easy.
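To give a rough idea of the approach (this is only an illustrative sketch; the gem's actual file name and YAML structure may differ), a plain YAML-backed list is trivial to read and write from Ruby:

  require 'yaml'

  # Hypothetical task structure, for illustration only
  tasks = [
    { 'name' => 'write the specs', 'tags' => ['important', 'due:24/08/2008'] }
  ]

  # Dump the list as YAML next to the project it belongs to
  File.open('todo.yml', 'w') { |f| f.write(tasks.to_yaml) }

  # Reading it back (or editing the file by hand in your editor) is just as easy
  YAML.load_file('todo.yml').each { |task| puts task['name'] }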

The todo gem runs specific to a directory. This enables you to keep a different task list in each directory. For example, I keep a todo list in my 'home' directory which holds all my housekeeping stuff, and then I have separate lists for each project I work on in their respective directories. This separation allows better organization of the things to be done.

      Example

Here is a basic example of how the todo gem works:

        #visit your project folder
        cd projects/newapp
      
        #create a new todo list for the project
        todo create
      
        #add a new task
        todo add "write the specs"
        - add tags : important, due:24/08/2008
      
        #listing all tasks
        todo list --all
      
        #listing tasks tagged 'important'
        todo list --tag important
      
        #removing a task by name
        todo remove "write the specs"
      
        #removing a task by index
        todo remove -i 1

      Get it!

To install the todo gem on your machine, simply run:

      sudo gem install todo

Also, the code of the gem is hosted on GitHub, so you can fork and flip it any way you want (and don't forget to send me a pull request).

      ]]>
Contact Form using Merb & DataMapper http://laktek.com/2008/07/28/creating-a-contact-form-using-merb-datamapper http://laktek.com/2008/07/28/creating-a-contact-form-using-merb-datamapper/#comments Sun, 27 Jul 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/07/28/creating-a-contact-form-using-merb-datamapper A major benefit of using Rails to develop web applications is its smart conventions, which push developers to adopt common patterns and avoid wasting time reinventing the wheel. Rails is strictly bound to a database and has a pre-defined directory structure. Though these conventions are helpful in common cases, they reduce the flexibility of using Rails in custom contexts.

If you are looking for a configurable yet smart framework, you should try out Merb. Approaching version 1.0, Merb is quite stable. I found it quite productive for a lot of cases where Rails would've been too bulky. Using the flexible ORM layer DataMapper with Merb makes things even simpler.

Here is a simple example of using Merb and DataMapper to implement a contact form. Contact forms are used to mail user feedback to the site administrator. Generally, there's no need to store the information in a database, though it may be useful to do validations and spam detection before sending it. This can be implemented with Merb by creating a flat application that has only a single file and a configuration directory. Further, DataMapper can be used to perform the validations without creating a database.

      Getting Started

First of all, you need to install the Merb and DataMapper gems. I recommend installing from the latest edge releases. This guide will show you how to do that.

Now let's create a new Merb app.

      merb-gen app myapp --flat

By passing the "--flat" flag, the generator will create a flat file structure for the app. Basically, it consists of two directories for configuration (config) and views, along with application.rb, which will contain the code.

      Configurations

Add the following to 'config/init.rb' to prepare the application environment.

First we'll define the routes. Like Rails, Merb also supports RESTful routes. We'll create a singular resource named 'contacts' to map to the Contacts controller.

      Merb::Router.prepare do |r|
        r.resource :contacts
      
        r.default_routes
      end

We need the Merb Mailer, Merb helpers (form controls, etc.), DataMapper core and DataMapper validations gems for this app. So let's define those dependencies.

      require "merb-mailer"
      dependency 'dm-core'
      dependency 'dm-validations'
      dependency "merb_helpers"

Next, specify the SMTP settings of your mail server, which will be used to send the mails.

      Merb::Mailer.config = {
          :host   => 'mail.example.com',
          :port   => '25',
          :user   => 'test@example.com',
          :pass   => '',
          :auth   => :plain,
          :domain => "example.com" # the HELO domain provided by the client to the server
        }

      Implementation

      Now let's move into 'application.rb' where we do the essential stuff.

Let's define a Contact class, which will be the DataMapper model. In the configuration we didn't set up a database connection for DataMapper since we are not going to store the information, so DataMapper will use the default abstract adapter. This is good enough to provide ORM behavior without persistence. In the model we define the fields and the required validations. DataMapper offers smart validations such as "validates_format :email, :as => :email_address", which save you from juggling with regular expressions.

      class Contact
        include DataMapper::Resource
      
        property :id, Integer, :serial => true
        property :name, String
        property :email, String
        property :message, String
      
        validates_present :name, :email
        validates_format :email, :as => :email_address
      end

Next we have to set up the controller. Our controller will have two actions - show and create. RESTful routes will match GET requests to the path '/contacts' to the show action. Similarly, POST requests to the same path will be mapped to the create action.

      class Contacts < Merb::Controller
        def show
          @contact = Contact.new
          render
        end
      
        def create
          @contact = Contact.new(params[:contact])
          if @contact.valid?
            send_info
            render "Thank you."
          else
            render :show
          end
        end
      
      private
        def send_info
          m = Merb::Mailer.new :to => 'lakshan@web2media.net',
                               :from => @contact.email,
                               :subject => 'Contact Form Results',
                               :text => "#{@contact.name} (#{@contact.email}) wrote : \n #{@contact.message}"
          m.deliver!
        end
      end

We defined a private method 'send_info' to handle the mailer functionality. Finally, here is the view template for the contact form (rendered by the show action):

       <%= error_messages_for :contact %>
       <% form_for :contact, :action => url(:contacts) do %>
            <div><%= text_control :name, :label => 'Name' %></div>
            <div><%= text_control :email,  :label => 'Email' %></div>
            <div><%= text_area_control :message, :label => 'Message' %></div>
            <%= submit_button 'Send' %>
       <% end %>

These form helper tags come with the merb_helpers gem.

Done!

We have done the coding, now let's see how it works.

To start your app, go to your app directory in the terminal and run the command merb. Then point your browser to 'http://localhost:4000/contacts' (normally Merb runs on port 4000). Check whether it works correctly.

That's all folks! Doesn't Merb sound great?

      I have uploaded the complete source of the sample to github - http://github.com/laktek/contact-form.

      ]]>
Passenger - Holy Grail for Ruby Deployment http://laktek.com/2008/07/17/passenger-holy-grail-for-ruby-deployment http://laktek.com/2008/07/17/passenger-holy-grail-for-ruby-deployment/#comments Wed, 16 Jul 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/07/17/passenger-holy-grail-for-ruby-deployment The popular notion about developing web apps with Ruby on Rails (or other Ruby frameworks) was "You can write a web app in 15 minutes, but it will take 15 days to deploy it correctly". Especially if you are coming from the world of PHP, where you just write the app and upload it, this might have been utterly confusing. Having to juggle Apache/Nginx, Mongrel clusters, or FastCGI just to deploy a simple web app would surely have sounded like a nightmare. You must have wondered countless times why it doesn't just work as in PHP.

Well, finally your prayers have been answered! Enter Phusion Passenger (a.k.a. mod_rails or mod_rack), a module for the Apache web server, which makes deploying a Ruby app a breeze. Yeah, just like in good old PHP, you can now just upload your app and you're live! Voila!

Thanks to Passenger, you can now use the same Apache web server where you hosted your legacy PHP apps to deploy your Ruby apps. Also, if you cannot afford to pay for a VPS, you can even use a cheap shared host such as Dreamhost (which already supports Passenger). If it was the hassle of hosting that prevented you from developing Ruby apps, it should not be a problem anymore.

So how do I get up and running?

Installing Passenger is simple and the process is unbelievably user-friendly. All you have to run is two commands in your terminal (I assume you have Apache 2, Ruby and RubyGems installed):

      sudo gem install passenger

      and then run,

      passenger-install-apache2-module

      Passenger installation screen

Then you will be presented with a guided installation process (shown above). Just do as it says and you're done!

If you are using Dreamhost, Passenger comes built in with the hosting package. You have to enable it for the domain you wish to host your app on, through the Dreamhost control panel.

Once you've installed (or enabled) Passenger, you can just upload the files using your favorite FTP program (or via SFTP/SSH).

When you make updates, you will need to restart the application to reflect the changes. That's also not difficult: all you have to do is create a blank file called 'restart.txt' in your application's tmp/ directory. You can do this by running the following command in the terminal.

      touch /webapps/myapp/tmp/restart.txt

If you are used to automated deployment with Capistrano, deploying apps can be even simpler. Here is a sample recipe showing how to do that.
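As an illustration (my own rough sketch, not the linked recipe), overriding Capistrano's restart task to touch restart.txt is usually all it takes:

  # config/deploy.rb - a sketch for a Capistrano 2 setup; adjust roles and paths to your app
  namespace :deploy do
    desc "Restart the app the Passenger way, by touching tmp/restart.txt"
    task :restart, :roles => :app do
      run "touch #{current_path}/tmp/restart.txt"
    end
  end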

If you need further information about Passenger, the official user guide and the following collection of resources can be helpful.

      Kudos

A few months ago there was a lot of dispute regarding the complexity of Ruby deployment, and DHH (creator of Rails) openly invited someone to tackle the challenge rather than just complain. In this context, Phusion, a small Dutch company, came up with Passenger. It's truly a great effort which has made our lives easy. Kudos to the fine folks over at Phusion, you guys really rock!

      ]]>
Passion http://laktek.com/2008/07/14/passion http://laktek.com/2008/07/14/passion/#comments Sun, 13 Jul 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/07/14/passion Most of us work hard in our lives, but how many of us achieve the goals we strive for? Complaints such as "I never missed a class and never missed homework but I couldn't get through the exam" or "Though I work 12 hours a day I'm not getting a promotion in my job" are very common in our society. Is it just bad luck or is there a better reason for this? As I see it, it's mainly because we don't have a true passion for what we do.

Success doesn't have a strict correlation to how much effort you put in or how much time you spend on it. There may be subjects you don't understand a shit about, but you could parrot-read and score good grades in the exams. There may be jobs which you hate, but which give you an increment if you don't take leave the whole year. But can we define these as success? At the end of the day, if you feel unsatisfied with what you do and if you don't see it adding any value to your life, it may not be your real passion.

People who cram always live in uncertainty: "What if the structure of the paper changes this year?" "What if the Java language loses its popularity?". If you have real passion, you may not feel insecure about your future. Passionate people can easily adapt to changes. They can even foresee those changes. How? Because they are ready for the challenge. They have gone to the depths of the subject and they are confident about themselves. Also, as I've experienced, if you have passion there's no shortage of opportunities.

Can we create passion? Passion for something involves your emotions; it's something you love to think about, love to talk about, and it will never make you feel bored. Most of the time, passion matches your core skills. Loving something because others succeed in it or earn better from it is not true passion. So it's better to think about yourself and try to understand what your true passion is.

      Photo Credit : Cram time (winter) by Pragmagraphr - http://www.flickr.com/photos/sveinhal/2075747765

      ]]>
Still Alive. http://laktek.com/2008/06/18/still-live http://laktek.com/2008/06/18/still-live/#comments Tue, 17 Jun 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/06/18/still-live I know most of you may have already assumed this blog is dead and removed it from your RSS readers too. However, it's not. I always intended to keep on blogging, but my lack of consistency and pure laziness didn't really help the cause.

During the past six months of silence, I actually tried to realign my thoughts and my life. To be frank, I was not happy with where my life was heading, and with the things happening around me the future seemed to be filled with a lot of uncertainty. When I was 17, I had a dream of where I wanted to be, and today I have almost achieved what I dreamt. The only thing is, when I taste the reality of those dreams today, it doesn't feel as comfortable and cosy as I wished as a teen. Today I don't see where I will be in 5 years (actually, I don't want to see). I cannot predict how my environment will change in those 5 years; I will have to adapt to it whatever the circumstances. That's why I feel more comfortable living for the day and not having concrete goals, so I can take opportunities as they come and live without worrying about what I've achieved. At the end of the day, if I can bring some happiness to the people around me, then I can feel satisfied. In the past few months I tried to live by this approach, spending more time with people who are close to me and also engaging in some new activities. It actually gave me a much better feeling than just being stuck with geekery.

If you still read this and wonder whether I've lost all my appetite for hacking... not quite. During this period I also worked with my colleagues at Vesess, developing a simple billing app called CurdBee. After a lot of effort, we were able to make it available to the public yesterday. It's free, so if you'd like to try it, register for an account. It's our first mass-market web app and it gives me a bit of self-satisfaction to be part of it.

      ]]>
      RMS in Sri Lanka http://laktek.com/2008/01/19/rms-in-sri-lanka http://laktek.com/2008/01/19/rms-in-sri-lanka/#comments Fri, 18 Jan 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/01/19/rms-in-sri-lanka with RMS

The week that just ended was a great week for the Sri Lankan FOSS community, as the father of the Free Software Movement, Richard M. Stallman (RMS), paid a visit to the country. Yesterday, I got the opportunity not only to listen to a live speech by the legend, but also to grab a picture with him. It was at the main public event held at SLIIT, Malabe, which was a full house!

RMS delivered a humorous and really enlightening talk, which made everyone retreat into themselves and understand how they are tied up with non-free software. Another important issue he brought up is the use of non-free (proprietary) software in schools and universities, which ties users into this evil software for their entire lives. I think this issue should be taken seriously by developing countries like ours, where we dream of having a standalone and stable economic environment without being held captive by multi-national firms. In the next decade this issue will become more of a concern with the growth of the IT market. It's important for the country to produce IT professionals who know the concepts solidly without depending on particular software to achieve that. The use of free software could provide an ideal foundation for this.

Also in his speech, RMS mentioned the easiest way for anybody to contribute to and advocate free software: always calling the system GNU/Linux (not just Linux). If you are a keen follower of the FOSS world you will know this is the longest-standing holy war in the community, but to me it seems GNU/Linux is the term we should use, because GNU/Linux refers to the great philosophy behind the whole movement, not just the software.

It's a pleasure to see such great people here in Sri Lanka, and kudos to ICTA for their efforts in this endeavor.

Hail St. IGNUcius (alias RMS)!

      ]]>
It was all due to a fat finger! http://laktek.com/2008/01/16/it-was-all-due-to-a-fat-finger http://laktek.com/2008/01/16/it-was-all-due-to-a-fat-finger/#comments Tue, 15 Jan 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/01/16/it-was-all-due-to-a-fat-finger After that bizarre story of Microsoft's support team calling a client back after 10 years, a similar thing happened to me yesterday with Dreamhost (my hosting provider). They sent me a payment overdue notice as of February 1st, 2009. Initially, I was wondering whether these tech companies are doing time travelling? However, it was later revealed that, as in the case of M$, Dreamhost too had made a typo while entering the figures...

      Today I've got an apology from Dreamhost billing team, and it almost made my day :P

      Hi Lakshan!

      Ack. Through a COMPLETE bumbling on our part, we've accidentally attempted to charge you for the ENTIRE year of 2008 (and probably 2009!) ALREADY (it was all due to a fat finger)!

      We're really really realllly embarassed about this, but you have nothing to worry about. Please ignore any confusing billing messages you may have received recently; we've already removed all those bum future charges on your account and fixed everything up.

      Thank you very very much for your patience with this.. we PROMISE this won't happen again. There's no need to reply to this message unless of course you have any other questions at all!

      Sincerely, The Foolish DreamHost Billing Team!

      Maybe adding humor is the best way to cover your mistakes...

      ]]>
What's your next move after A/L ? http://laktek.com/2008/01/03/whats-your-next-move-after-al http://laktek.com/2008/01/03/whats-your-next-move-after-al/#comments Wed, 02 Jan 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/01/03/whats-your-next-move-after-al Results of the G.C.E. Advanced Level (A/L) examination for the year 2007 were released yesterday, and as usual we got to see a lot of drama and mixed emotions over the results. I don't know whether there is any exam in the world more competitive than the A/Ls in Sri Lanka. This is because the A/L is the pathway into the State University System (which provides free higher education). This opportunity is limited: out of the 250,000 students who sit for A/Ls, only about 20,000 will get a chance to enter a state university. So for many students (or at least their parents) what matters is not passing the exam, but passing it with flying colours to get selected to a State University (unfortunately, there were some cases where even 3-A passes weren't adequate for university entrance). For students from families with lower incomes, this is the only hope of changing their living conditions. In contrast, for middle-class parents, their child being selected to a state university is more a symbol of social esteem.

However, it's very common that students are left with no idea of what to do after their A/Ls, or that the path they take (or are forced to take) leads them nowhere. As I see it, there are four common paths a student may take after their A/Ls, and I would like to share my thoughts on the consequences of each of these, based on what I've experienced and seen. However, I'm just an undergrad and these thoughts are only from my point of view, so take them with a grain of salt. If you have a different viewpoint, please do share it in the comments.

      Repeating the Exam

This may look like the obvious step for students who failed the exam, but in reality this is the option taken by most students, including ones who passed. I know of students who start preparing for their 2nd attempt even before the results are released. I wonder about the reason behind this - maybe a lack of confidence in yourself? As I see it, it's important to pass the A/Ls, since it's the stepping stone to any career or higher education opportunity. So if you've failed the exam, it's better to go for a 2nd attempt and try to pass. What about the students who passed the exam, yet are thinking of a second attempt? There is a perception among some students (or their parents) that the sole intention of the A/Ls should be getting selected to a state university to follow Medicine, Engineering, Management or Law. As I see it, this is the myth that makes the A/Ls so competitive and a rat race. There are many better opportunities even within the state universities themselves (such as IT, Microbiology, Textile Design and Industrial Management) than the above four disciplines. Actually, in today's scenario, I don't believe one career could be better than another in terms of opportunities or perks; it's the person involved in the career who makes that difference. So my advice: if you have gained a good z-score, look for other degree courses available in State Universities, and if you feel interested and have passion in that area, apply for it. Think wisely about what you want to be and what your skills are before wasting another precious year of your life repeating the exam.

      Enroll for a degree in a State University

As I mentioned earlier, do a reality check on your interests and passions before selecting a degree course to follow. Remember, this decision will affect your whole career ahead. I have seen people becoming frustrated with their careers after a short period, and just doing the job for the sake of doing it (which is true even for some doctors and engineers). I guess this is also one of the reasons for the low productivity in the country.

My advice is: don't select a course just because it has a higher cut-off z-score. Imagine you have a great passion for Architecture but your z-score makes you eligible for Engineering - which would you select? When you have the potential to be the next Geoffrey Bawa, why would you settle for being an average civil engineer?

So don't allow the z-score to override your dreams ;)

Following a Professional Course or Enrolling for a Private Degree

The facts I mentioned in the above context also apply here. Try to find unique areas of study which match your capabilities. So don't just settle for doing CIMA because your next-door neighbor does it.

Apart from that, another thing you should consider is the quality of the private institute you are going to enroll in: which professional bodies are they affiliated with, if they are providing a degree, is it an internal or external degree, what sort of facilities are available (libraries, labs, lecture rooms), and what is the demand for this field in the job market.

      Finding a Job

Finally, the next popular option among many students is to find a job after A/Ls. Actually, what you should be targeting at this point in life is a career rather than a job. Don't settle for jobs where you don't see clear career growth. Most people blindly fall into the trap of a high starting salary without looking at the opportunities available for evolving the career. Look carefully at whether the job will lock you into doing the same task for your entire life. At the very least, the job you take should encourage you to follow a professional course by providing a flexible working schedule.

      ]]>
Goodbye 2007 http://laktek.com/2007/12/31/goodbye-2007 http://laktek.com/2007/12/31/goodbye-2007/#comments Sun, 30 Dec 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/12/31/goodbye-2007 So today we see the end of another wonderful year in our lives. It's hard to believe how fast time travels; it feels like only yesterday that I remember the dawn of 2007. This constantly reminds us how short our lives are and how limited the time we have to do something worthwhile.

Personally, I feel satisfied with how things went for me in 2007. Participation in Google Summer of Code was the major highlight of the year for me. Meanwhile, I was able to do well in my career at Vesess and in my degree. Still, I see there is room for improvement in terms of effort and skill. So my focus for the next year will be to work harder and become ruthless.

Talking about the internet industry, we can call 2007 the year of Social Networks. The rise of services like Facebook and Twitter showed that the human factor is more important than the technology. People seem to have become more comfortable with these services, and social networking capability has become an essential element of every successful internet application.

Predicting 2008, I feel it will be the year mobile computing becomes mainstream. Along with this, web-based applications will take the lead over desktop applications in 2008. These two breakthroughs together will empower the business and social lives of humans. We will see more sophisticated mobile devices inspired by Apple's iPhone (I'm awaiting phones with Google's Android platform), and we may finally see WiMax (802.16) supported devices in the market. This will create a huge opportunity for web-based applications in the coming year. Easy access from any location, improved user experiences and more social networking capabilities will be the things to consider in building tomorrow's web applications.

      ]]>
It's your chance to fly high with Google http://laktek.com/2007/11/29/young-brothers-its-your-chance-to-get-high-with-google http://laktek.com/2007/11/29/young-brothers-its-your-chance-to-get-high-with-google/#comments Wed, 28 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/29/young-brothers-its-your-chance-to-get-high-with-google When I was in college, I envied Google Summer of Code for being open only to university students. Maybe that indirectly influenced me to get into university ;). Anyway, I ended my envy later by participating in GSoC. If you are a college/high school student under 18 years and have the same envy I had, then here is some news for you! Google has announced "The Google Highly Open Participation Contest".

This can be called the pre-university version of Google Summer of Code, where the objectives are similar - to get youngsters involved in contributing to open source projects. Here you will have to work on tasks from the 10 listed open-source projects and complete them by 4th February 2008. So I guess the timing is perfect, as you will be free from your school work during the holiday season. Also, attractive prizes are on offer for successful participants. You will receive a certificate and t-shirt if you complete a task, and you will be paid $100 for every 3 tasks you complete. The 10 grand prize winners will be selected and will have the opportunity of a paid trip to the Googleplex (wow!)

If you are considering participating, then I would like to recommend the SilverStripe CMS project. The SilverStripe codebase is purely XHTML, CSS, JavaScript and PHP5 based, so you won't require deep knowledge to start hacking. Also, SilverStripe is a cool project which has a lot of potential, and you will be delighted to be part of such an elite community ;). From my Summer of Code experience I can tell you that SilverStripe has a very friendly community, so you will never run out of support. Have a look at SilverStripe's task list - http://code.google.com/p/google-highly-open-participation-silverstripe/issues/list. You will notice there are a variety of tasks such as testing, designing and documenting, where you don't even require programming skills.

Anyway, if you are considering being a GHOPer under SilverStripe, I'm more than willing to help you in whatever ways I can. Get in touch with me.

      ]]>
My contributions released officially! http://laktek.com/2007/11/29/all-my-gsoc-contributions-released-officially http://laktek.com/2007/11/29/all-my-gsoc-contributions-released-officially/#comments Wed, 28 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/29/all-my-gsoc-contributions-released-officially Yesterday, along with its feature-rich 2.2 release, SilverStripe officially released the 3 modules I worked on during GSoC. This is an even greater achievement for me than successfully completing the GSoC project itself. It's a great pleasure to see people around the globe using my code to get things done.

The 3 modules will make your life easy when reusing content from Flickr, YouTube and Technorati on your site, so you can come up with wonderful mashups on the SilverStripe platform. If you are a photographer, media creator or blogger, try using these modules with SilverStripe to create a truly customized site for yourself (you'll be amazed to see how easy it is).

I'd love to hear your feedback and suggestions.

      Download SilverStripe 2.2 : http://www.silverstripe.com/downloads/

      Download the mashups modules from here : http://www.silverstripe.com/modules/

      ]]>
      Twitter Widget for SilverStripe CMS http://laktek.com/2007/11/15/twitter-widget-for-silverstripe-cms http://laktek.com/2007/11/15/twitter-widget-for-silverstripe-cms/#comments Wed, 14 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/15/twitter-widget-for-silverstripe-cms I released a new widget for SilverStripe CMS, which allows you to show your Twitter status in your SilverStripe based blogs. More Info and Instructions for usage.

      There will be more interesting stuff coming up for SilverStripe along with the 2.2 release. With the new version SilverStripe further simplifies your effort in building and managing web sites. So try out SilverStripe for your next web site and feel the difference..

      ]]>
The Linux Desktop http://laktek.com/2007/11/09/the-most-fancy-desktop-os http://laktek.com/2007/11/09/the-most-fancy-desktop-os/#comments Thu, 08 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/09/the-most-fancy-desktop-os Last night, I at last decided to upgrade my machine to Ubuntu 7.10 (Gutsy Gibbon). In fact, it's been almost one month since the release, and I was a bit reluctant to go for the upgrade early. I had been a bit busy with development work lately and my machine was stable with Feisty Fawn (7.04), so I didn't want to take the gamble of upgrading early. Also, I experienced a few hiccups in the last upgrade, with high demand hitting the Ubuntu servers and broken repos. This time the upgrade went smoothly, and this seems to be the most stable Ubuntu version ever!

With the new version of the Linux kernel included in Gutsy, it seems the long-standing issues I had with the graphics and sound card are over. Sound works normally when resuming from hibernation, and the default speakers are muted when I plug in the headset. These two trivial issues were bugging me in the past and I never found a workaround :) What I love most is Compiz Fusion working out of the box, without needing any extra configuration effort.

      ]]>
Walkthrough of the Redesign http://laktek.com/2007/11/03/walkthrough-of-the-redesign http://laktek.com/2007/11/03/walkthrough-of-the-redesign/#comments Fri, 02 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/03/walkthrough-of-the-redesign When I first launched my blog, the main intention was to make my mark online. Instead of sticking with Blogspot (I did have a blog there for a short stint) or wordpress.com, I decided to go one step further and host the blog on my own domain with a customized template. That was a good move at the time and it did catch the eyes of several people. The blog enabled me to get into networks, and I had a niche readership community composed mainly of local geeks. That was fine for a start.

Later, when my other engagements became priorities, the focus on the blog diminished as usual. The readership also didn't expand, and it stalled for a while. Some of my friends criticized the blog for being stagnant, and from them I learnt that the presentation of the blog didn't appear attractive enough for casual readers either. In conclusion, I felt this blog needed an overhaul in terms of both content and layout. Hence, I came up with the redesign.

      It took me about one week to finish the whole thing and I'm constantly tweaking it to look better.

      Site Structure

I felt having too many posts on the homepage would make it difficult for a reader to grasp the content. I decided to go with a tabloid style, giving more focus to the latest article and having the next 3 most recent posts at a secondary level. I hope this will give better attention to each single post.

Apart from original musings, I've added a separate section called 'masterpieces' to share the great stuff I find on the net. The five most recent masterpieces will be displayed on the homepage. I recommend subscribing to its RSS feed to catch every beat (http://feeds.feedburner.com/laktek/masterpieces).

Another new addition in this redesign is tagging for posts. I think the tag cloud in the footer will give a first-time visitor a clear idea of the flavor of the blog.

      Design

This is my first site designed on a grid-based layout. It was a bit of an experiment and I really liked the concept. Anyway, it was the Blueprint CSS framework that made the process so easy. I went with the default 24-column grid and it was so convenient for positioning the elements. Blueprint also has nice built-in reset and typography styles, which took a load of CSS hackery off my ass. Oh! I should not forget the print stylesheet that comes with it, which makes the site look smart in print without any effort.

As a developer, I really love Blueprint CSS. It made my life easy by taking care of all the stuff I hate doing with CSS, which allowed me to focus more on the look and feel. My design buddies had a slightly different take on the concept; they felt it was somewhat bloated and too heavy. But for an occasional designer with a development attitude, Blueprint is the answer! (Remember how much developers hate the structure of CSS code itself.) Hats off to Olav Bjorkoy for creating such an awesome framework.

      The Platform

Though I was looking at alternatives to replace WordPress as the platform, I finally decided to stick with WP, mainly for its stability as a blogging system. This site now runs on WP 2.3.1, and WP is basically mature enough to cover all the basic needs of a blog, from tagging to comment spam handling. Also, thanks to the large community around WP, you never run short of plugins. If you wish for some feature, then someone else has already built it and been generous enough to release it as a plugin.

      My only issue with WP is the templating system. That may be because I've been working so much with MVC in the recent past. A strength of MVC-based systems such as SilverStripe and Ruby on Rails is their clean templates; the clear separation of functionality from the view makes life easy.

      That's basically how things went, and I'm fairly happy with the outcome. Its success, though, depends entirely on how you, the readers, respond and how useful you find this place. That's the essence of user-centred design. So drop your comments; they will really help me make this better.

      ]]>
      The Renaissance http://laktek.com/2007/11/02/the-renaissance http://laktek.com/2007/11/02/the-renaissance/#comments Thu, 01 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/02/the-renaissance It's been more than two months since the last post on this blog, and you might have thought this was another blog heading for the deadpool. If so, you're wrong! Since I started this blog 1 1/2 years ago, it has given me a platform to air my thoughts and brought me many opportunities. The temporary silence was just a retreat to come back better equipped. So here I am again, taking my next round in the blogosphere with a completely redesigned site.

      Take a look around the site and drop your comments about the new blog. Watch for the next post for my complete walkthrough of this redesign.

      ]]>
      Sweet Summer of Code, Thank you Google... http://laktek.com/2007/08/25/sweet-summer-of-code-thank-you-google http://laktek.com/2007/08/25/sweet-summer-of-code-thank-you-google/#comments Fri, 24 Aug 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/08/25/sweet-summer-of-code-thank-you-google GSoC 2007 finally came to an end on the 20th, leaving us with loads of experience and plenty of other good things to take away. The program builds a passion for Open Source software among students by providing an opportunity to work with more than 100 OSS projects over their summer holidays. Even though it is framed as a summer vacation program, any university student around the globe can participate. So students like me, who don't actually have such a summer vacation (in Sri Lanka we don't experience seasons, thus no summer holidays), are also eligible. The best thing for some is the $4,500 stipend Google pays for the work done in the program. Honestly, it's a great relief for our academic expenses for the year, but I believe the experience and reputation the program brings are more important in the long term.

      Talking about reputation, it's great to be in a community of people from all around the world (almost every continent) who know you by your first name. This was mainly because SilverStripe was a small, tightly knit community, since it was still a budding project. I guess I made the right move by applying to an up-and-coming project like it rather than a big project like GNOME, Apache or Drupal, where I would be just another contributor among many. SilverStripe valued our work very much and took a lot of effort to give us exposure. The most wonderful thing was our work being featured in the tech talk SilverStripe presented at Google headquarters (thanks Sig!).

      I hadn't contributed much to FOSS prior to GSoC, but now I feel that, as a developer, it's essential to work on a FOSS project in your spare time. So I'll be continuing my work with SilverStripe CMS (and maybe some other projects if I find more time). Not only does it bring you credibility and reputation, you also get to polish your skills by learning from the masters.

      Here are some lessons I learnt from GSoC; I hope they help you as well.

      Polishing your coding skills

      No programming course can teach the art of programming or its best practices. Those only come with experience. It may be as simple as the placement of a parenthesis or line indentation, but it's better to do it the correct way. In open source projects, where documentation is often mediocre, it's the code that acts as the documentation or the spec.

      At SilverStripe there was a strong emphasis on coding standards and best practices. Most of what I learnt about the project was from reading the code itself. Likewise, many will try to understand how my module works just by looking at its source code, so a lot of careful thought had to go into the coding. The seasoned hackers on the core team (especially my mentor, Matt) inspired me to do things the right way.

      Now I treat coding as a creative exercise, like a painter working on a canvas. It's not just copy-pasting and changing a few things. This is something I would never have learnt on my own, even after tons of freelance projects.

      How to set achievable goals

      GSoC is a program with a limited time frame (as is any other project). Even though we may come up with a mind-blowing proposal, practical constraints make it difficult to achieve everything within the period. So it's essential to break the project into several chunks and deliver them over several iterations. This lets you focus on a single small target rather than one large, wayward one.

      Working under pressure

      One of the major challenges I faced in GSoC was balancing academic work and project work. As I mentioned earlier, in Sri Lanka we don't have summer holidays, so GSoC ran alongside normal semester work. I couldn't use this as an excuse, since I had promised to complete the proposed work within the time frame, and I didn't want to throw away the great opportunity that had come my way. In the end, I managed to keep the balance and completed almost every assigned task by the deadline.

      BTW, I learnt that I must be willing to take such risks if I want to go beyond average.

      Communication and Collaboration

      Working with a team of developers spread across the globe, in different time zones, was a new experience for me. A lot of decisions were made through communication over IRC, forums and Skype.

      Development collaboration was mainly done through the source code management system (Subversion) and Trac. SilverStripe offered a special branch in the repository for GSoC, which let us commit our patches freely without worrying too much about conflicts with the trunk. They also created separate areas in the SCM for the modules we developed and gave us total control over them.

      Supporting the Community

      Apart from development, a developer on an open source project must be willing to help the community use the product. Responding to questions on the forum or IRC was also part and parcel of the project assignment. I also documented the usage of the modules I developed as much as possible. As I understand it, the support developers give users is one of the key factors in the success of an open source project: the more support is available, the more people are willing to try out FOSS products.

      Lastly, I must not forget the surprise gift Google sent us. Producing Open Source Software by Karl Fogel acted as a bible for me in understanding the concepts and attitudes an open source developer needs.

      ]]>
      JAVA to be traded in Stock Market?? http://laktek.com/2007/08/24/java-to-traded-in-stock-market http://laktek.com/2007/08/24/java-to-traded-in-stock-market/#comments Thu, 23 Aug 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/08/24/java-to-traded-in-stock-market Jonathan Schwartz has announced that Sun Microsystems is going to change its stock ticker symbol from SUNW to JAVA. The move implicitly suggests that Java technology itself is being traded, and people could be fooled into believing they own shares of the JAVA technology rather than of Sun Microsystems. One wonders whether this was the real motive behind releasing Java as open source technology: spread the technology everywhere and use its popularity to raise the organization's stock value.

      ]]>
      Silverstripe at Google Tech Talks http://laktek.com/2007/08/03/silverstripe-at-google-tech-talks http://laktek.com/2007/08/03/silverstripe-at-google-tech-talks/#comments Thu, 02 Aug 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/08/03/silverstripe-at-google-tech-talks SilverStripe founders Sigurd Magnusson and Sam Minnee were recently invited to do a Tech Talk at Google. Watch their talk about New Zealand, starring in Lord of the Rings, how they destroyed the ring, and Summer of Code.

      SHAMELESS NOTE: Watch around the 27th minute to see Sigurd presenting the work I've done for GSoC.

      (You can download the full video from Google Video.)

      ]]>
      Love Puzzles? Try these... http://laktek.com/2007/07/30/love-puzzles-try-these http://laktek.com/2007/07/30/love-puzzles-try-these/#comments Sun, 29 Jul 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/07/30/love-puzzles-try-these Warning: these may be quite addictive and will require you to burn a lot of brainpower :)

      • Da Vinci's Other Code - http://www.coudal.com/davinci.php
      • School of Government - http://www.coudal.com/theotherfish.php
      • Einstein's Fish Puzzle - http://www.coudal.com/thefish.php

      AND

      Which Porn Star Ate the Most Hot Dogs? - http://www.coudal.com/hotdog.php

      (via Coudal Partners)

      ]]>
      Refactoring Catalog http://laktek.com/2007/07/17/refactoring-catalog http://laktek.com/2007/07/17/refactoring-catalog/#comments Mon, 16 Jul 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/07/17/refactoring-catalog Refactoring is just as important as coding the behaviour itself. I enjoy refactoring my code, as it leads to much cleaner code and teaches you how to do things better. But identifying what and where to refactor is something a programmer gains only through practice.

      Recently I found a great catalog of commonly used refactorings, compiled by Martin Fowler.

      Visit or bookmark the link - http://refactoring.com/catalog/index.html
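
      As a tiny taste of what the catalog covers, here is a made-up before-and-after of one entry, Extract Method, sketched in TypeScript (the invoice example and names are mine, not from the catalog):

      // Before: one function mixes the summing loop and the printing together.
      function printInvoice(items: { name: string; price: number }[]): void {
        let total = 0;
        for (const item of items) {
          total += item.price;
        }
        console.log(`Total: ${total.toFixed(2)}`);
      }

      // After (Extract Method): the summing logic gets its own reusable, testable function.
      function calculateTotal(items: { name: string; price: number }[]): number {
        return items.reduce((sum, item) => sum + item.price, 0);
      }

      function printInvoiceRefactored(items: { name: string; price: number }[]): void {
        console.log(`Total: ${calculateTotal(items).toFixed(2)}`);
      }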

      ]]>
      Enjoying my Summer Of Code with SilverStripe http://laktek.com/2007/07/11/enjoying-my-summer-of-code-with-silverstripe http://laktek.com/2007/07/11/enjoying-my-summer-of-code-with-silverstripe/#comments Tue, 10 Jul 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/07/11/enjoying-my-summer-of-code-with-silverstripe First of all, I beg your pardon for not updating you on my Summer of Code experiences as I promised. Once I get busy coding I tend to forget all the other essential stuff that needs to go along with hacking. Yeah, I know that's not good practice for a FOSS developer, and I'm trying to get rid of the habit :)

      Anyway, I had a very productive first half with SilverStripe CMS, and it's a joy to see SilverStripe has already released parts of my project as modules. It's also been fun to work with such a courteous and like-minded bunch of developers from all parts of the globe. Compared with other FOSS organizations, SilverStripe is a very young and trendy organization that could make a big impact on the FOSS world in the years to come. Thumbs up to Google for supporting such a budding organization through GSoC; if not for GSoC, I would never have known a project called SilverStripe CMS & Framework existed.

      So you'd be eager to know what I did with my project over the last 1 1/2 months... If you recall, its main objective was to provide mashup capabilities to the SilverStripe CMS & Framework. I decided to achieve this by providing connections to web services via a RESTful interface. I prefer REST over RPC, and I hope to talk about that in more depth in a separate post.

      In summary, these are the tasks I completed during the period:

      1. Created the RestfulServices class, which can be extended to support any RESTful web service API. (more info)
      2. Built the FlickrService API by extending RestfulServices. (more info) A rough conceptual sketch of this idea appears after this list.
      3. Created the FlickrGallery controller, a page type users can add to their SilverStripe CMS to display a set of Flickr photos as a gallery. (see the demo)

        The interesting thing is that users can add this functionality entirely through the WYSIWYG editor, without writing a single line of code. (How's that sound? :) )
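
      To give a feel for the idea behind points 1 and 2: the base class knows how to talk to any RESTful endpoint, and each concrete service only fills in its own URL and parameters. The actual module is PHP code inside SilverStripe; the snippet below is only a conceptual TypeScript sketch with hypothetical names (including the endpoint and method string), not the real API.

      // Generic base class: builds the request URL and fetches the raw response.
      class RestfulServices {
        constructor(protected baseUrl: string) {}

        protected async request(path: string, params: Record<string, string> = {}): Promise<string> {
          const query = new URLSearchParams(params).toString();
          const url = `${this.baseUrl}${path}${query ? `?${query}` : ""}`;
          const response = await fetch(url);
          if (!response.ok) {
            throw new Error(`Request failed: ${response.status}`);
          }
          return response.text();
        }
      }

      // A service-specific subclass only needs to know Flickr's endpoint and parameters.
      class FlickrService extends RestfulServices {
        constructor(private apiKey: string) {
          super("https://api.flickr.com/services/rest/"); // illustrative endpoint
        }

        // Fetch a user's public photos; parsing the returned payload is left out here.
        getPhotos(userId: string): Promise<string> {
          return this.request("", {
            method: "flickr.people.getPublicPhotos",
            api_key: this.apiKey,
            user_id: userId,
          });
        }
      }

      A page type such as the FlickrGallery controller would then simply call something like getPhotos() and hand the result to a template.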

      One of the complaints I got from friends when I suggested SilverStripe CMS to them was the lack of ready-made themes. Yes, you get the spiffy default BlackCandy, but people love having more options. So, to fill that vacuum to some extent, I created the PaddyGreen theme, which I will release in the next few days. I wanted to make it trendy, simple and customizable, though there is still room for improvement. This isn't part of my GSoC proposal; it's just a pet project.

      What I've mentioned is just the tip of the iceberg; there is even more cool stuff to be released over the next few months from me and my fellow GSoCers. So if you are in search of a user-friendly CMS or framework for your next project, consider SilverStripe as well :)

      ]]>
      Let's bid farewell to Prof. V.K. Samaranayake http://laktek.com/2007/06/08/lets-bid-farewell-for-profvksamaranayake http://laktek.com/2007/06/08/lets-bid-farewell-for-profvksamaranayake/#comments Thu, 07 Jun 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/06/08/lets-bid-farewell-for-profvksamaranayake Today is a day of mourning for the Sri Lankan IT industry, with the demise of Sri Lanka's "Father of IT", Prof. V.K. Samaranayake. Many will agree that the Sri Lankan IT industry's successes today are largely due to the vision of Prof. Samaranayake.

      Beginning with CINTEC and later through ICTA, the work he did for the development of the IT industry in this country will always be remembered. The introduction of the BIT external degree program, which enabled many students to get into the IT field; fuelling the FOSS community by helping to launch the LSF; Sinhala language support in Windows; and the e-government project, which digitized the procedures of many government institutions, are some of the notable projects that took place under Prof. Samaranayake.

      The vacuum left by his passing will never be filled, and the whole country will miss him in the years to come. My sincere condolences to the Professor's family; may he rest in peace.

      ]]>
      Daily reads for wannabe WebDs http://laktek.com/2007/05/22/daily-reads-for-wannabe-webds http://laktek.com/2007/05/22/daily-reads-for-wannabe-webds/#comments Mon, 21 May 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/05/22/daily-reads-for-wannabe-webds This is a collection of sites I recommend reading daily for people eager to sharpen their web design/development skills. It was initially prepared for my fellow batch mates at the university, but later I thought, why not share it with the rest of the world too :) These resources may be nothing new for seasoned players in the game, but I hope newcomers will find something to reap.

      Have any other recommendations from your personal experience?

      ]]>
      .app TLD for Web Applications http://laktek.com/2007/05/18/app-tld-for-web-applications http://laktek.com/2007/05/18/app-tld-for-web-applications/#comments Thu, 17 May 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/05/18/app-tld-for-web-applications Web applications have become the next trend on the internet, and it seems the day they replace traditional desktop apps isn't that far off. But web applications still aren't the easiest thing for average users; one of the major barriers is simply getting to the application itself. As I mentioned in my previous post, most web applications have confusing domain names.

      This is mainly due to the unavailability of simple, common terms under the existing top-level domain extensions. So how about introducing a separate TLD for web applications? I feel .app would be the most suitable extension. It would make web applications stand out from information-only web sites, a similar concept to the .mobi extension, which was intended to serve only mobile content.

      What do you think about this idea?

      UPDATE

      Fast-forward to 2011: this idea is on the verge of becoming a reality. Since ICANN opened the doors for registration of custom gTLDs, several non-commercial initiatives such as http://dotappapp.com/ have been trying to secure the rights to the .app gTLD. They are currently raising the funds needed for ICANN consideration.

      ]]>
      Confusing Web 2.0 Domains http://laktek.com/2007/05/14/confusing-web-20-domains http://laktek.com/2007/05/14/confusing-web-20-domains/#comments Sun, 13 May 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/05/14/confusing-web-20-domains How do you define a Web 2.0 application? The usual distinguishing features include AJAX, rounded corners, large input fields and badges. But did you notice another feature common to most apps of the Web 2.0 culture? Most of them have confusing, difficult-to-spell domain names or URLs.

      It's a PITA introducing these apps to my non-geeky friends. Here is how a typical conversation goes.

      me : Do you know that cool project management tool called basecamp ?

      buddy : Nope. What is the site basecamp.com ?

      me : no pal u hav to add hq to end it's basecamphq.com

      buddy : or maybe I'll visit the company site and use the product from there, that seems much easier.

      (types the URL)

      buddy : There is no site called ThirtySevenSignals.com ???

      me : no no mate it's 37 numeric 3 and 7!

      buddy : Oh! this makes me sick. Show me something not that geeky in web 2.0 world...

      me : ah! you gotta visit flickr that's awesome photo sharing service :)

      buddy : What??? seems like they are out of business, that domain is for sale

      me: Are u gone crazy ?? I'm using it now

      buddy : I'm not, it seems you are. You said F-L-I-C-K-E-R.com right? Flicker.com??

      me : nope buddy it's F-L-I-C-K-R not E-R ..

      buddy : how can I remember all these weird names?

      me : thats why you should use a bookmarking service to save ur brain from interpreting these URLs

      buddy : tell me such a site

      me : Delicious

      buddy : the site may be delicious but how do I go to it?

      me : Just type DEL dot ICIO dot US

      buddy : what DEL ? you mean the DEL key on the keyboard ??

      me : Just GOOGLE for the site

      buddy : excuse me how many O's in between the 2 G's ?

      me :

      ]]>
      City under attack? http://laktek.com/2007/04/29/city-under-attack http://laktek.com/2007/04/29/city-under-attack/#comments Sat, 28 Apr 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/04/29/city-under-attack An update on the current situation in Colombo (specifically here in Wattala). Nothing is confirmed yet, and the past few hours have been filled with fear and uncertainty.

      We got electricity back a few minutes ago (around 3.50 am), and now the whole city seems calm and quiet. There is still no official update on what happened a few hours ago, but it seems something serious did take place in the skies over the city.

      We were all glued to the television, intensely watching the Sri Lankan run chase against the Aussies in the World Cup final. Suddenly, around 1.10 am, there was a power failure. Blaming the electricity board for poor load balancing, my brother and I went to our room expecting the power to come back in a few minutes.

      Just then we heard the sound of an aeroplane going along the seaside, and we didn't take it very seriously at the time. But my brother and I did talk about the possibility of an air raid on the country at a moment when everyone was concentrating on nothing but cricket.

      Suddenly, around 1.30 am, we heard a noise like firecrackers, followed by suspicious loud blasts that were definitely not firecrackers. I called one of my uncles (they had electricity in their area) and he said it was probably just firecrackers, as Sanath had hit three successive boundaries at that very moment. We thought the same, since the whole city was celebrating. Meanwhile, we were trying to contact the electricity board to ask what had gone wrong with our power line.

      Around 2.00 am, another uncle from Kandy called with news of an air raid over the city. A few phone calls later, the story was confirmed. By this time there were plenty of rumours flowing about attacks on the airport, the harbour and the Kelanitissa power plant, and it was clear that about 80% of the city was in the dark. When I checked the official web site of the ministry of defence and TamilNet over GPRS, there was news of a possible air strike on Colombo.

      Around 3.00 am we again heard the sound of an aircraft flying over, and what we saw were tracer bullets being fired at it. We clearly heard the noise of gunfire.

      As I'm writing this, I heard some firing again and I'm not sure what it's all about. For now it's quiet again, and it's 4.37 am in Colombo.

      UPDATE: Official reports now confirm the bombing of oil facilities in Kerawalapitiya and Kolonnawa. The Kerawalapitiya plant is in very close proximity to our house in Wattala.

      ]]>
      One step away from the ultimate title http://laktek.com/2007/04/25/one-step-away-from-the-ulitmate-title http://laktek.com/2007/04/25/one-step-away-from-the-ulitmate-title/#comments Tue, 24 Apr 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/04/25/one-step-away-from-the-ulitmate-title Whooo! They have done it again! Mahela and crew are one step away from bringing glory back to this small island in the Indian Ocean. Congrats guys! It's a marvellous display from our beloved cricketers!

      Cricket is more than just a sport in Sri Lanka; it's a passion. The whole nation has gone crazy over our superb performances in this World Cup. Sri Lankans now simply eat cricket, drink cricket and live cricket!!! Everyone will be praying to see a repeat of 1996 (or even better) on the 28th of April.

      I was in 5th grade when Arjuna lifted the cup at Gaddafi Stadium, Lahore. Eleven years have passed since that glorious day, and my enthusiasm for the game has never changed. Actually, it has only got better, like Sanath's batting :) Even though I'm just a TV spectator, I dance in ecstasy and weep in disappointment (like yesterday, when Sanath threw his wicket away) as if I'm out there in the middle with them. I think it's not only me; all Sri Lankan fans act the same way. It's now 6.30 a.m. on a normal working day here in Sri Lanka, and I still don't hear a single vehicle on the road (not even the trains). It seems the whole nation is still relaxing and enjoying last night's great victory (the match ended around 3.30 a.m.), and today will kick off rather late for Sri Lanka :)

      Talking about our team, I think the current side is a much stronger and better-balanced outfit than the one we had at the 1996 World Cup. Sanath, Murali and Vaas have grown with experience, and the die-hard efforts of young blood such as Tharanga and Malinga are quite impressive. I also haven't seen a captain as aggressive and tactical as Mahela in the game. We are truly playing our own brand of cricket right now, and I guess that will be the difference at the end of the day.

      A little trivia to end this post: Marvan Atapattu was a member of both Sri Lankan squads, in 1996 and this time, yet he hasn't had the opportunity to play a single match in either World Cup (and isn't likely to play in the final either). I know it's always nice to be part of a winning outfit, but I wonder how Atapattu must be feeling about this...

      ]]>
      Accepted for Google Summer of Code! http://laktek.com/2007/04/12/accepted-to-google-summer-of-code http://laktek.com/2007/04/12/accepted-to-google-summer-of-code/#comments Wed, 11 Apr 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/04/12/accepted-to-google-summer-of-code Today is one of the greatest days of my life, as I got accepted to Google Summer of Code (GSoC) 2007. It's a dream come true for me, and I'm looking forward to some great experiences.

      Google Summer of Code is a program organized by Google for university students, where they work on an open source project for around two months under the mentorship of an organization. It gives a huge boost to the open source community, as well as splendid experience for budding developers. This year 900 students were selected worldwide out of 6200 applications, to work with 100 mentoring organizations.

      I will be working on SilverStripe CMS and Framework. SilverStripe has been a great project, and it's really fun to work with such a community. SilverStripe is basically a cross between Drupal and Ruby on Rails (RoR), which makes it an easy platform for developing any kind of site. It's a delight to work with the stuff I love most, such as MVC, AJAX, web standards, etc. I hope I can contribute to the success of SilverStripe and help make it a perfect framework for web development. I will be updating this blog with further technical details of my project, and I will share the experiences I gain.

      It's also a great delight to see my fellow Vesessins, Laknath and Amila, getting accepted as well. The inspiration from Prabhath's experience with GSoC last year also helped us immensely. Now four of our team have GSoC experience, which should give the world a clear indication of the quality of Vesessins. I wish my buddies good luck too! Looking forward to exciting and challenging days ahead :)

      ]]>
      Early Bird view to Google Summer of Code Project Ideas http://laktek.com/2007/03/08/early-bird-view-to-google-summer-of-code-project-ideas http://laktek.com/2007/03/08/early-bird-view-to-google-summer-of-code-project-ideas/#comments Wed, 07 Mar 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/03/08/early-bird-view-to-google-summer-of-code-project-ideas As most of you may be aware (or not), Google Summer of Code for 2007 has been announced. This year I'm eager to apply for a project, even though I'm not sure whether I will get accepted. Project ideas from the accepted mentoring organizations will only be released on the 14th of March, and the student application period ends on the 24th of March. Since I (and many others) haven't been actively involved in projects with these organizations, I thought it better to begin preparations early and select a good project. Most of the organizations have put up a wiki page of potential projects.

      I thought of sharing the list of project ideas I compiled. Note that the final project ideas of these organizations may differ, and not all of them may get accepted to GSoC.

      UPDATE: A lot of new project ideas that I missed have been added in the comments section. Check them as well.

      ]]>
      LakTEK has joined the Blue Fish Network http://laktek.com/2007/01/21/laktek-is-joined-with-blue-fish-network http://laktek.com/2007/01/21/laktek-is-joined-with-blue-fish-network/#comments Sat, 20 Jan 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/01/21/laktek-is-joined-with-blue-fish-network One of my new year resolutions was to give more attention to my blog and serve it to a much bigger audience. Today I was notified by Blue Fish Network that they have added my blog to their network. I see this as yet another opportunity to take my blog to a wider audience.

      Blue Fish is a UK-based global blog network that currently includes 67 members, who muse on a wide array of subjects. I've been included in the Technology stream along with other cool folks like Nick Barrett, Kevin Sylvia and Francisco.

      So I take this opportunity to thank Andy Merrett and the rest of the Blue Fish Network for accepting me into your community. I'm hoping to build some great relationships through this opportunity.

      ]]>
      Keeping the focus http://laktek.com/2007/01/13/keeping-the-focus http://laktek.com/2007/01/13/keeping-the-focus/#comments Fri, 12 Jan 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/01/13/keeping-the-focus For the first time this year, I'm touching my blog. Since I started it back last May, I've gained great exposure, come to know various cool people and got involved in great projects. But when more work pulled me away, the blog was simply neglected. Actually, I think at times I filled it with more noise than signal. This year I'm going to give the blog more focus, and I hope it will also mature with experience.

      2006 was a great year for me personally. I got into university for my higher studies and also made steady progress in my professional career as a web developer. I got the opportunity to work on some great freelance projects, which I really enjoyed. Besides that, I had the opportunity to join Vesess as one of their developers. Vesess has the spirit of a new-age web startup, and it seems to be a place where I can really advance my career.

      So the challenge in 2007 is to move up the ladder while keeping up the same momentum. 2006 was a year of establishment for me; I fulfilled my needs, and the next step is to keep up the commitment. That will be much harder, but as always I'm ready for the challenge.

      ]]>
      The tools I use http://laktek.com/2006/12/15/the-tools-i-use http://laktek.com/2006/12/15/the-tools-i-use/#comments Thu, 14 Dec 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/12/15/the-tools-i-use Here is the list of tools I use for my day to day activities.

      • Machine : Compaq Presario v3000
      • OS : Ubuntu 6.10 (Primary OS), Windows XP Media Center (OEM version, keeping it for testing)
      • Editor : Scribes (it's becoming the TextMate for GNU; an awesome editor)
      • Browser : Firefox 2, Opera
      • CLI : Bash
      • Web Host : Dreamhost
      • Framework : Ruby on Rails
      • CMS/ Blog : Wordpress, Drupal
      • Javascript Library : Prototype
      • Debugger : Firebug (yes, the new beta rocks !)
      • Project Management : Active Collab
      • FTP Client : FileZilla
      • Version Control : Subversion
      • Office Suite : OpenOffice
      • Image Editing : Photoshop CS2 (Main reason I cannot get rid of my windows partition), Google Picasa, GIMP
      • Email Client : Gmail (Including my POP3 accounts)
      • IM : via GAIM (IRC, Google Talk and Yahoo)
      • Distance Communication : Skype
      • Music Player/Manager : Amarok
      • Torrent Client : uTorrent via Wine (I haven't seen a better torrent client than that)
      ]]>
      Gmail realigns its interface http://laktek.com/2006/11/10/gmail-realigns-its-interface http://laktek.com/2006/11/10/gmail-realigns-its-interface/#comments Thu, 09 Nov 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/11/10/gmail-realigns-its-interface It seems Gmail has also joined the rounded-corner and drop-shadow galore with its interface upgrade. Even though there isn't a major change to the main UI, the message interface has become much sweeter.

      They have also moved all the message actions into a top-right drop-down menu, which is actually more irritating than what they had earlier. However, it seems Google has some bigger upgrades coming to Gmail; this may be just the tip of the iceberg.

      ]]>
      Semantics on Word Processing http://laktek.com/2006/11/05/semantics-on-word-processing http://laktek.com/2006/11/05/semantics-on-word-processing/#comments Sat, 04 Nov 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/11/05/semantics-on-word-processing As an ardent fan of web standards and semantically correct web design, I have always tried to avoid using tables for page layout. But I never thought of taking that practice beyond web design.

      Today, however, I got a call from a less tech-savvy person who wanted help with the layout of a Word document (yes, it was M$ Word). He wanted a two-column layout on a landscape A4 page. Even though I hate such shitty word processing jobs, I didn't want to disappoint him, so I went over to help.

      I'm not good at word processing, and I'm not comfortable with MS Word either; I normally trust OpenOffice for my small word processing needs. So I had to work on the problem with the little experience I had. At that moment tables looked like the only solution. I knew that wasn't the right way, but my vanity kept me from exploring all the menu options or searching the help. The end result was wasted time and a barely usable layout. He was happy with it, but I know that will only last until he wants to amend the content of a cell.

      After I returned home, I took some time and found that column layouts are one of the built-in features of the word processor. It was just a matter of clicking a button and selecting the right layout. I feel like I should have followed one of those boring courses on basic office packages :) Then again, I don't think Sri Lankan institutes even teach that properly; what they teach is Microsoft Office. They act as if word processing can only be done in MS Word and spreadsheets only in MS Excel. (How many of you know about Google Office?)

      So what did I learn from this? I think semantic design should be practised everywhere it's possible. Use tables only when you want to display tabular data. I have also seen a lot of people use spaces and tabs to control indentation, but for that too there are proper tools in a word processor.

      Practise semantic design in your word processing; it will save you the hassle of fixing the layout every time you make a change to the document.

      ]]>
      What was the first web site you visited? http://laktek.com/2006/10/29/what-was-the-first-web-site-you-visited http://laktek.com/2006/10/29/what-was-the-first-web-site-you-visited/#comments Sat, 28 Oct 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/10/29/what-was-the-first-web-site-you-visited Inspired by Sam's post, I too tried to recollect memories of my early internet experiences. The first web site I visited was www.cricket.org, which I still visit daily and which is now better known as cricinfo.com. Connecting to the internet using a Lanka Internet prepaid card with a 28k modem seems funny in the current context, but that little experience led me to a whole new world of opportunities.

      Anyway, can you remember the first site you visited?

      ]]>
      Hacking made easy with Google Code Search http://laktek.com/2006/10/10/hacking-made-easy-with-google-code-search http://laktek.com/2006/10/10/hacking-made-easy-with-google-code-search/#comments Mon, 09 Oct 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/10/10/hacking-made-easy-with-google-code-search I was so bored studying for exams that I just hit on this new Google Code Search. Oh, do you know what I was able to extract with a few keyword combinations? A whole bunch of FTP and database connection credentials for sites currently live on the net. Those credentials seemed to be valid, and I didn't want to mess with those sites, so I left them as they were. It's damn simple; even a three-year-old could be a hacker with this.

      Anyway folks, better keep your sites safe from the Google Code spider. From what I've read, it seems to index all files, including files inside compressed archives found in public directories. So don't leave your sensitive code there.

      ]]>
      Will Google come to Sri Lanka? http://laktek.com/2006/09/21/will-google-come-to-sri-lanka http://laktek.com/2006/09/21/will-google-come-to-sri-lanka/#comments Wed, 20 Sep 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/09/21/will-google-come-to-sri-lanka Today I came across a Google job openings page that lists four openings in South Asia, including a position for a Sri Lanka Country Consultant. For these positions Google doesn't mention an exact location, only that "This position is located in South Asia." It sounds like Google is going to open a new South Asian regional office apart from the ones it has in India. What is more exciting is that the country consultant position is for Sri Lanka only (not Pakistan or Bangladesh). Does this signify the possibility of seeing a Google office in Sri Lanka?

      Sri Lanka has been more noticeable to Google than any other South Asian country in the recent past, thanks to events like Google SoC and ApacheCon. Who knows, they may see Sri Lanka as a potential market. There are already local SMEs using Google AdWords, and more importantly, Google may see a huge market for future products such as a web office suite and a Google OS in developing countries like Sri Lanka.

      So how important would it be for us to have Google here? There would be many benefits, such as infrastructure development and recognition. Google is also a major supporter of the FOSS community, and FOSS will be key for countries like ours in the years ahead, so having Google represented here would be a catalyst for the local FOSS community as well. Let's hope Google opens an office in Sri Lanka and many Sri Lankan engineers get to work there...

      ]]>
      They know where you've been...... http://laktek.com/2006/09/01/they-know-where-youve-been http://laktek.com/2006/09/01/they-know-where-youve-been/#comments Thu, 31 Aug 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/09/01/they-know-where-youve-been I stumbled on this critical exploit (or rather, smart technique) while visiting "How Web2.0-Aware are you??". They calculated a percentage based on how many of the 42 sites on their list I had visited. So how did they know which sites I'd visited? Yep, they sniffed my browser history!

      Actually, this is just one lame example of the practical use of this new way of tracking users' browsing habits, which came into play a few days ago via Jeremiah Grossman's blog. The bug, however, has been sitting in Mozilla's tracker for about four years without catching anyone's eye.

      So how do they do it? The hack is a combination of CSS and JavaScript. Remember the CSS pseudo-class a:visited, which makes visited links appear in a different colour (or style)? They use JavaScript to walk through a set of hyperlinks and check whether each link's computed style matches the a:visited style. It's possible to do this over a large list of links hidden from the user's screen. Using this technique, a site could check whether you've arrived after visiting its potential competitors and use that to tailor your surfing experience. Some sources say the technique is already used by many e-commerce sites. But what if it falls into evil hands? Online blackmail??
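
      For the curious, here's a rough sketch of how such a script might look, written in TypeScript. The link list, off-screen container and colour value are my own illustrative choices, not taken from the sites mentioned above, and modern browsers now deliberately block this trick by lying about the computed style of visited links.

      // Assumes the page's stylesheet contains: a:visited { color: rgb(255, 0, 0); }
      const VISITED_COLOR = "rgb(255, 0, 0)";

      // Candidate URLs to test against the visitor's history (purely illustrative).
      const candidates = [
        "http://www.basecamphq.com/",
        "http://www.flickr.com/",
        "http://del.icio.us/",
      ];

      function sniffHistory(urls: string[]): string[] {
        // Park the probe links off-screen so the visitor never sees them.
        const container = document.createElement("div");
        container.style.position = "absolute";
        container.style.left = "-9999px";
        document.body.appendChild(container);

        const visited = urls.filter((url) => {
          const link = document.createElement("a");
          link.href = url;
          link.textContent = url;
          container.appendChild(link);
          // If the browser applied the a:visited colour, the URL is (probably) in the history.
          return window.getComputedStyle(link).color === VISITED_COLOR;
        });

        container.remove();
        return visited;
      }

      console.log("Probably visited:", sniffHistory(candidates));

      The mitigations below work precisely because the trick depends on both history-driven :visited styling and JavaScript being allowed to read it.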

      So how can you get around this? There are several ways. One easy way is to set the browser's history retention to 0 days. You could also disable JavaScript in the browser, which stops such scripts from executing. I also found two Firefox extensions that can be used to fix the issue.

      P.S. - It seems the Opera browser isn't affected by this exploit.

      ]]>
      Let's geotag Flickr photos http://laktek.com/2006/08/30/lets-geotag-flickr-photos http://laktek.com/2006/08/30/lets-geotag-flickr-photos/#comments Tue, 29 Aug 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/08/30/lets-geotag-flickr-photos Two days ago Flickr added another awesome feature to its service: drag-and-drop geotagging. Integrated with Yahoo Maps, it lets you go down to street level in any part of the world. Yeah! You can find many major cities in Sri Lanka too!

      All you have to do is go to Flickr, click on the drop-down Explore tab and select 'Photos on a map', then type in a location you like (for example, Colombo or Anuradhapura). Zoom into the location and click on the hotspots to see the pictures tagged there.

      GREAT! Now you're probably excited about getting your own photos on the map too. That's also quite easy: go to your account and select the Organize tab, and on that screen select the Map tab. There you go! You just drop your photos onto the location where they were taken (or wherever you want to geotag them). You can find the location easily using the search box and by zooming in and out. All of this shouldn't take more than 15 seconds (well, not on dial-up :P). If you want to learn more, check out the screencasts and blog post at Flickr.

      This feature opens up many new possibilities. One thing I'm looking forward to is an integration between this and Google Earth. Any other cool ideas?

      ]]>
      Sri Lanka becoming a FOSS giant in Asia? http://laktek.com/2006/08/04/sri-lanka-becoming-a-foss-giant-in-asia http://laktek.com/2006/08/04/sri-lanka-becoming-a-foss-giant-in-asia/#comments Thu, 03 Aug 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/08/04/sri-lanka-becoming-a-foss-giant-in-asia Google recently released a report with statistics for Google Summer of Code 2006 (FYI: it's a program where Google pays students to work on various open source projects). According to the report, Sri Lanka is among the top 20 countries by accepted students and by number of applications. It seems 10 out of the 75 project applications from Sri Lanka were accepted.

      I think this is great because, apart from India and China, Sri Lanka is the only other Asian country in the top 20. This is a clear sign that Sri Lanka is rising as a major player in the world of FOSS initiatives. Clearly, world-recognised FOSS projects such as Apache Axis and Sahana, which were developed on the island, bear a major responsibility for this success.

      This could be the beginning of a new wave in the Sri Lankan IT industry and economy. FOSS projects may open up many opportunities, giving our developers international exposure and recognition.

      ]]>
      eSriLanka PC - how will it affect us? http://laktek.com/2006/07/10/esrilanka-pc-how-it-will-affect http://laktek.com/2006/07/10/esrilanka-pc-how-it-will-affect/#comments Sun, 09 Jul 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/07/10/esrilanka-pc-how-it-will-affect Remember all the flames over Intel CEO Craig Barrett labelling OLPC's $100 laptop project as just a gadget? Even though the $100 laptop is still a prototype, Intel has made a move ahead by sponsoring the eSriLankaPC.

      eSriLankaPC is a project under ICTA with the objective of increasing PC adoption in the country. As Mr. Barrett wanted, it is a fully functional PC based on Intel® architecture. The PC comes with pre-installed software and educational programmes, and the desktop environment is based on Linux. You can take the PC home and pay in monthly instalments: the lowest model costs Rs. 32,300, with a monthly payment of Rs. 1,140. They are also ready to offer a trilingual help desk.

      Now the big question: will this project genuinely benefit the country and its citizens? It may be rather too early to draw conclusions, but will it make a big impact on the country's IT literacy, or will it just be another trap of the multinational money machine?

      The price of this PC is still about the same as what's in the bazaar, so there isn't a huge advantage there. Considering that the average monthly household income in Sri Lanka is around Rs. 12,000, and the current cost of living, will an average family be ready to spend Rs. 1,140 every month?

      One of the biggest plus points of this project, as I see it, is the huge rollout of FOSS in the country. In fact, I think that will be the major difference between this PC project and what is currently on offer. Since users will not get a pirated copy of Windows pre-installed, they will adapt to the Linux-based open source environment (unless they know how to format, partition and install pirated Windows). Will it signal a new era with FOSS in the mainstream?

      ]]>
      Home Office, anyone? http://laktek.com/2006/06/18/home-office-anyone http://laktek.com/2006/06/18/home-office-anyone/#comments Sat, 17 Jun 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/06/18/home-office-anyone Life on this island is getting too difficult. The rising flames of the war and ever-increasing oil prices are squeezing the people. Coming into Colombo has become anyone's worst nightmare. The number of checkpoints and barriers has doubled the traffic jams, and it takes roughly two hours to travel 20 km. Even if you can afford the journey, paying 100 bucks for a litre of petrol, what is the use when you have to park your vehicle 10 km away from your office for security reasons? (The folks at the WTC would know this better.) Public transport has always been chaotic, and today people who opt to travel by bus or train need real guts. Every little noise alarms the passengers, and if you carry big baggage you are always under suspicion (not only that, today even pregnant mothers are under suspicion).

      In the current situation, I don't think many employees who work in the capital have the peace of mind to concentrate purely on their work. They always work under doubt and extra pressure. This reduces efficiency and leaves little room for innovation. Our trained and qualified workforce is not that large, and if we lose even a single life, it's a major loss to the whole country. That's why terrorists keep attacking the civilian workforce, trying to break the backbone of the country's economy.

      In this situation, I personally think the home office concept should be put into practice. I think we already have enough infrastructure to do so. A home PC with the necessary software and an ADSL line would be adequate for most service-based work, and we could also make the best use of free online tools such as Basecamp, Skype, Google Calendar, etc.

      I hope our CEOs will give this some consideration and move the country forward tactfully in this gloomy period.

      ]]>
      Becoming a Semi Geek... http://laktek.com/2006/06/11/moving-into-new-grounds http://laktek.com/2006/06/11/moving-into-new-grounds/#comments Sat, 10 Jun 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/06/11/moving-into-new-grounds These days my life is going through a flip with my admission to university. I'm now spending most of my time in meatspace rather than in siliconspace. In this part of the world, students still have more faith in Kuppi (local university slang for group studies) than in O'Reilly books or Wikipedia, and the more healthy relationships you have in meatspace, the better your chances of success here.

      During the last two weeks, I learnt that to get the best out of campus life I should actually live in it. Travelling daily from home would not only exhaust me but also keep me from getting the most out of this life. The problem, however, is that I don't want to lose the geek life I've lived for ages. I'd still love to hang out on the net in the wee hours, learning about RoR unit testing, experimenting with Python scripts on my Symbian phone, and so on. So now I'm thinking of living the semi-geek life.

      My plan is to move to a room near campus with a refurbished laptop that has a WLAN card plugged into its USB port. I'm doing this with a slight hope of connecting to the wireless network on campus. (Have any of you at Mora tried this before?) Since I don't think I can survive without internet, I'm also hoping to buy a CDMA phone as a backup.

      ]]>
      Stepping into the Higher Studies http://laktek.com/2006/06/04/stepping-into-the-higher-studies http://laktek.com/2006/06/04/stepping-into-the-higher-studies/#comments Sat, 03 Jun 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/06/04/stepping-into-the-higher-studies Last week I began chapter two of my life: after 13 exciting years of school education, I finally entered higher education. I got selected as an internal student at the IT Faculty of the University of Moratuwa (UoM).

      I regard this as a great opportunity, since the course is considered job-oriented and up to date (also, saying you are an undergrad at UoM is a matter of pride in itself). Another thing that excites me is the opportunities you get at this university. Last week a team from our faculty won the local round of the Microsoft Imagine Cup for the second consecutive year, and many students from our faculty are working on projects for Google Summer of Code. See how exciting it is!

      The university environment is quite different. The people, culture and traditions are all new, and getting used to them isn't easy. At times it's fun, at times challenging, and I love both the fun and the challenge.

      ]]>
      Adding Power to the Backend http://laktek.com/2006/05/28/adding-power-to-the-backend http://laktek.com/2006/05/28/adding-power-to-the-backend/#comments Sat, 27 May 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/05/28/adding-power-to-the-backend Web 2.0 is all about web applications; blogs, wikis, discussion groups and other custom-made web apps have made life easy. Web application development has its own recipe. You depend on XHTML, CSS and JavaScript for the front end, but for the backend you can choose a scripting language or framework of your choice. This choice matters, as the whole business logic of your app relies on it. Picking the right tool takes both knowledge and experience.

      Choosing the Right Tool

      When choosing a language or framework for a project, you need to look into aspects such as:

      1. How easily you can achieve the expected goals. (You don't want to code your blogging system in C, do you?)
      2. The costs of development, hosting and deployment. (The best example is the huge popularity of PHP.)
      3. Backward compatibility. (Bloggers will know how easy it was to upgrade from WordPress 1.x to 2.0.)
      4. Availability of resources: code libraries, samples, support communities, etc.
      5. Complexity - if you spend most of your development time learning the system and correcting syntax, you haven't picked the right tool for the job. (That's why I personally prefer Ruby on Rails and Python over PHP.)

      Tools for the Job

      Here I will put up a list of programming languages and frameworks that can be used for server-side development. If I have missed anything, let me know; I'm trying to make this list a more complete resource.

      • Ruby on Rails (RoR) 37signals created RoR as an ideal web framework; it showed its power as the backend for Basecamp, Backpack and Campfire. Rails is a framework written in Ruby and follows the MVC (Model, View, Controller) architecture. I personally like RoR for the simplicity of the language and the automation of tedious tasks most developers love to avoid (form validation, XML parsing, etc.). The best thing about RoR is that it's open source. If you're excited about this, I will write more posts on RoR in the future; meanwhile, you can visit www.rubyonrails.com for more info.
      • Python Even Google (need a better example?) uses Python for many of its applications, even though it's not only a web programming language (even NASA uses it ;)). I like Python over PHP for its human-readable syntax and module-based structure. It combines the power of low-level languages such as C with high-level capabilities.
      • PHP (Hypertext Preprocessor) The most popular server-side scripting language today, and also an open source project. Most web applications are built on it, including popular packages such as WordPress, Drupal, Mambo, etc. Most web hosts support PHP, and you can find plenty of online resources for it.
      • Java (J2EE) Java has been part and parcel of internet development from the early days and has become an ideal solution for e-business. Sun's Java Enterprise platform includes JavaServer Pages and various other web service technologies, which make it a great internet development tool. Low-cost deployment and a multi-platform architecture give J2EE a clear lead over Microsoft's .NET framework.
      • .NET Framework If you can afford to use it (and if you still love them), Microsoft's .NET framework is one of the most capable systems for a web backend. ASP.NET with ADO.NET provides the backbone for development. The recent release of the Express editions lets small-scale developers get a taste of it. Still, .NET hosting charges are a bit high.
      • ASP 3.0 This became very popular in the 90s, but today it has been replaced by ASP.NET; still, I have seen some local developers using it.
      • ColdFusion ColdFusion is another web scripting platform, built around tag-based markup. Macromedia bought it from its original owner, Allaire. It's a powerful language with a lot of built-in features, but the lack of web hosts and resources makes it difficult for developers.
      • CGI – Perl/ C CGI-based Perl scripting was the first generation of server-side scripting. Compared to the server-side scripting languages available today, Perl had a rather complex syntax that was pretty difficult to read. Most hosting packages support CGI, and you can easily find scripts such as form mailers that are still handy even on a basic web site.
      If you pick the correct tool, then almost 50% of your project is done. Voila!

      ]]>
      What to blog and what not to? http://laktek.com/2006/05/26/what-to-blog-and-what-to-not http://laktek.com/2006/05/26/what-to-blog-and-what-to-not/#comments Thu, 25 May 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/05/26/what-to-blog-and-what-to-not Blogging is a unique art. Bloggers are not poets or technical writers, and not even news reporters. Bloggers are bloggers, and a blog post should feel like a blog post! Even though blogs are referred to as a free medium with no bounds, I feel it's always quite tricky to decide what to blog and what not to!

      I know some people blog about just about anything, and some succeed by doing it too. But to be honest, it's always been a puzzle for me to decide what should go on the blog (maybe because I just entered the blogosphere and have stage fright??). I keep worrying: does my content make sense to others? Will it help someone in some way? Or has this subject been blogged better elsewhere?

      For now I don't have high expectations for blogging (such as getting into the Technorati Top 100 or receiving a $10k+ cheque from AdWords), and I think staying away from such obsessions will make blogging more free and fun. Knowing that I'm not Jeffrey Zeldman or Ray Ozzie, and that I don't always have to be spot on, gives me a bit of relief.

      I enjoy blogging; it keeps me busy in my idle hours and helps increase my caffeine intake! I know that if I master the art I could be a guru too, but until then there's a big learning curve. Finally, I doubt whether this is even worth blogging, but anyway, I'm pushing PUBLISH!

      ]]>
      Why Sri Lankan designers should adopt Web Standards http://laktek.com/2006/05/25/why-sri-lankan-designers-should-adapt-to-web-standards http://laktek.com/2006/05/25/why-sri-lankan-designers-should-adapt-to-web-standards/#comments Wed, 24 May 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/05/25/why-sri-lankan-designers-should-adapt-to-web-standards It seems most web designers around the world are adopting web standards, and many major sites are redesigning (realigning, rebooting, or whatever) to comply with them, but interest in the subject here in Sri Lanka is very poor. Most sites, whether government or commercial, are still stuck with table-based or entirely Flash-based layouts. Forget existing sites; not even new sites follow web standards.

      You don't need high-end hardware or any sophisticated software to create standards-based designs (that's what I like most about it), it's not rocket science to learn, and most of the learning resources are free! (I actually haven't spent a cent learning web standards.) I think there are a few reasons that keep developers away from web standards, such as:

      • They are still in love with their WYSIWYG editor. (Remember the first day you tried web design with Notepad?)
      • They still think CSS is only a way to make hover effects on hyperlinks.
      • If nothing rotates, scrolls, flares, jumps or hops on the screen, they think their friends (or boss!) may not call it cool.
      • They still use IE 5 and think its ActiveX controls are cool.
      • They get their paycheck at the end of the month without much fuss!
      There may be more but these are what only come to my mind right now.

      So if belong to above category then why should you adapt to Web Standards?

      1. Most of the surfers still are on dialup and they have to pay for every second they are online. With web standards it's quick to render in browser and users could see your web page on the fly.
      2. Thanks to Sinhala and Tamil Unicode localized content is getting popular and there will be more ask for these in the coming years. By separating content from presentation you can serve one document in many ways, suiting the readers' language, culture and region etc. Read this to get a better idea
      3. There are people who are surfing the web through their mobile phones. Since GPRS users are paying for Kilobytes they will praise you if the site is compact and they would not like to scroll all over the site to find the navigation bar. With Web Standards you can present your same page compact and styled specifically for Mobile.
      4. Disabled People also have their rights to acquire information, so your site should be accessible for them too. Read this article it will explain it to you better than me
      5. If you are willing to earn some foreign currency to the country you should know web standards. If you like to bid for an out-sourced project knowing web standards will be a must, because as I told you earlier world is using web standards and without it they may not call you a web designer.
      Those are five reasons what I think applies to the Sri Lankan context, there may be more. Here Roger Johansson lists 10 reasons why to use Web Standards. (An old article but still applicable)

      ]]>
      Share Alike Sri Lanka http://laktek.com/2006/05/23/share-alike-sri-lanka http://laktek.com/2006/05/23/share-alike-sri-lanka/#comments Mon, 22 May 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/05/23/share-alike-sri-lanka The Sri Lankan geek community has been all excited these days with the arrival of Prof. Lawrence Lessig, founder of Creative Commons. Even though I was not able to attend the conference due to unforeseen circumstances, I'm keeping track of the event thanks to some hardworking bloggers and Flickr users (unfortunately no podcasters!)

      CreativeCommons.lk Launched ??

      Even though several blogs claimed that ICTA launched the Sri Lankan version of the Creative Commons web site, I still couldn't access it. Maybe the server is crammed with all the Sri Lankan artists, musicians and writers claiming their Creative Commons licenses, lol! (I get the smell of another "Ruwanwelisaya type" web launch!)

      And, to celebrate this great occasion...

      I have also decided to make my blog and its future posts available under a Creative Commons license. So if you like the theme or content of my blog, you can make use of it as long as you use it for non-commercial purposes.

      UPDATE : The Creative Commons site seems to be working now, but it's another disappointing table-based design. At least if the designers had had a look at their mother site, they would have come up with something better.

      ]]>
      Ayubowan http://laktek.com/2006/05/20/ayubowan http://laktek.com/2006/05/20/ayubowan/#comments Fri, 19 May 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/05/20/ayubowan Rather than saying Hello World, I prefer to welcome you to my blog by saying Ayubowan! (FYI: it's the way of saying welcome in Sinhalese.)

      It seems like I'm the last person in the world to enter the blogosphere; even though it's a late start, I hope I can deliver something good (actually, I kept a blog that nobody read on Blogspot for a couple of months). Let's see how this goes.

      My passion is web technologies, and therefore you will find most of the posts here related to design, development, web standards, AJAX and all other web++. I will try to make this blog meaningful for everyone rather than filling it with geek stuff.

      Also, please note this blog is still in BETA stage and you may find a few hiccups. Talking about the blog design, I wanted to have a clean Sri Lankan style layout (yes, I got inspired by Nidahas) and I'd like to hear your reviews.

      ]]>
      Stepping into the Higher Studies http://laktek.com/2006/06/04/stepping-into-the-higher-studies http://laktek.com/2006/06/04/stepping-into-the-higher-studies/#comments Sat, 03 Jun 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/06/04/stepping-into-the-higher-studies Last week I began chapter 2 of my life: after going through 13 exciting years of college education, I finally entered higher education. I got selected as an internal student of the IT Faculty of the University of Moratuwa (UoM).

      I regard this as a great opportunity, since this course is considered a more job-oriented and up-to-date one (also, saying that you are an undergrad at UoM is a matter of pride in itself). Another thing that excites me is the opportunities you get at this university. Last week the team from our faculty won the local round of the Microsoft Imagine Cup for the second consecutive year, and there are many students from our faculty working on projects for Google Summer of Code. See how exciting it is!

      The university environment is quite different. The people, culture and traditions are all new, and getting used to it is not that easy. At times it's fun, at times challenging. I love both the fun and the challenge.

      ]]>
      Becoming a Semi Geek... http://laktek.com/2006/06/11/moving-into-new-grounds http://laktek.com/2006/06/11/moving-into-new-grounds/#comments Sat, 10 Jun 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/06/11/moving-into-new-grounds These days I'm going through a flip and turn in my life with my admission into university education. Now I'm spending most of my time in meatspace rather than in siliconspace. In this part of the world, students still have more faith in Kuppi (local university slang for group studies) than in O'Reilly books or Wikipedia, and the more healthy relationships you have in meatspace, the more your success here is assured.

      During the last two weeks, I learnt that to reap the best out of campus life I should live in it. Traveling daily from home will not only exhaust me but also prevent me from gaining the best out of this life. However, the problem is that I don't want to lose the geek life I have lived for ages. I would still love to hang out on the net in the wee hours, learning about RoR unit testing, experimenting with Python scripts on my Symbian phone, etc. So now I'm thinking of living the semi-geek life.

      My plan is to move to a room near campus with a refurbished laptop which would have a WLAN card plugged into the USB port. I'm doing this with a slight hope of connecting to the wireless network at the campus. (Have any of you guys at Mora tried this before?) Since I don't think I can survive without internet, I'm also hoping to buy a CDMA phone as an alternative.

      ]]>
      Home Office, anyone? http://laktek.com/2006/06/18/home-office-anyone http://laktek.com/2006/06/18/home-office-anyone/#comments Sat, 17 Jun 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/06/18/home-office-anyone Life on this island is getting too difficult. The rising flames of the war and ever-increasing oil prices are squeezing the people. Today, coming to Colombo has become the worst nightmare for anyone. The number of checkpoints and barriers has doubled the traffic jams, and it takes roughly two hours to travel 20 km. Even if you can afford this journey by paying 100 bucks for a litre of petrol, what is the use when you have to park your vehicle 10 km away from your office for security reasons? (Guys at the WTC should know this better.) Public transport has always been chaotic, and today people who opt to travel by bus or train need real guts. Every little noise alarms the passengers, and if you carry big baggage you are always under suspicion (not only that, today pregnant mothers are also under suspicion).

      In the current situation I don't think many employees who work in the capital have the peace of mind to concentrate purely on their work. They always have to work under doubt and extra pressure. This will decrease efficiency and cut the room for any innovation. Our trained and qualified workforce is not that plentiful, and if we lose a single life, it's a major loss to the whole country. That's why terrorists keep attacking the civilian workforce, trying to break the backbone of the country's economy.

      In this situation I personally think the home office concept should be put into practice. I think we already have enough infrastructure to do so. A home PC with the necessary software and an ADSL line would be adequate for most service-based work, and we could also make the best use of free online tools such as Basecamp, Skype, Google Calendar, etc.

      I hope our CEOs will give this some consideration and move the country forward tactfully in this gloomy period.

      ]]>
      eSriLanka PC - how it will affect ? http://laktek.com/2006/07/10/esrilanka-pc-how-it-will-affect http://laktek.com/2006/07/10/esrilanka-pc-how-it-will-affect/#comments Sun, 09 Jul 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/07/10/esrilanka-pc-how-it-will-affect Remember all the flames we had when Intel CEO Craig Barrett labeled OLPC's $100 laptop project as just a gadget? Even though the $100 laptop is still a prototype, Intel has made a step ahead by sponsoring the eSriLankaPC.

      eSriLankaPC is a project under ICTA which has the objective of increasing PC adoption in the country. As Mr. Barrett wanted, this is a fully functional PC based on Intel® architecture. The PC comes with pre-installed software and educational programmes. The desktop environment will be based on Linux. You can take the PC home and pay in monthly installments. The lowest model costs Rs. 32,300 and you have to pay Rs. 1,140 monthly. They are also ready to offer a trilingual help desk.

      Now the big question is: will this project genuinely benefit the country and its citizens? It may be rather too early to come to any conclusions, but will this make a big impact on the country's IT literacy? Or will it just be another trap of the multinational money machine?

      The price of this PC is still the same as what's in the bazaar, so there isn't a huge advantage there. Considering that the average monthly household income in Sri Lanka is around Rs. 12,000, and the current cost of living, will an average family be ready to spend Rs. 1,140 every month?

      One of the biggest plus points of this project that I see is the huge rollout of FOSS in the country. Actually, I think that will be the major difference between this PC project and what is currently on offer. Since users will not get a pirated copy of Windows pre-installed, they will adapt to a Linux-based open source environment (unless they know how to format, partition and install pirated Windows). Will it signal a new era with FOSS in the mainstream?

      ]]>
      Sri Lanka becoming a FOSS giant in Asia ? http://laktek.com/2006/08/04/sri-lanka-becoming-a-foss-giant-in-asia http://laktek.com/2006/08/04/sri-lanka-becoming-a-foss-giant-in-asia/#comments Thu, 03 Aug 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/08/04/sri-lanka-becoming-a-foss-giant-in-asia Google has recently released a report with statistics from Google Summer of Code 2006 (FYI: it's a program where Google pays students for working on various open-source projects). According to this report, Sri Lanka is among the top 20 countries with the highest number of accepted students and the highest number of applications. It seems 10 out of 75 project applications from Sri Lanka have been accepted.

      I think this is great because, apart from India and China, Sri Lanka is the only other Asian country in the top 20. This is a clear sign that Sri Lanka is rising as a major player in the world of FOSS initiatives. It's clear that world-recognised FOSS projects such as Apache Axis and Sahana, which were developed on the island, hold a major share of the responsibility for this success.

      This could be the beginning of a new wave in the Sri Lankan IT industry and economy. FOSS projects may open up many opportunities, giving our developers international exposure and recognition.

      ]]>
      Lets geotag Flickr photos http://laktek.com/2006/08/30/lets-geotag-flickr-photos http://laktek.com/2006/08/30/lets-geotag-flickr-photos/#comments Tue, 29 Aug 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/08/30/lets-geotag-flickr-photos Two days ago Flickr added another awesome feature to its service: drag-and-drop geotagging. Integrated with Yahoo Maps, it allows you to go down to street level in any part of the world. Yeah! You can find many major cities in Sri Lanka too!

      All you have to do is go to Flickr, click on the Explore drop-down tab and select 'Photos on a map', then just type in the location you like (for example, Colombo or Anuradhapura). Zoom into the location and click on the hotspots to see the pictures tagged with that location.

      GREAT! Now you're probably all excited about how to get your photos on the map too. That's also quite easy: go to your account and select the Organize tab. On that screen select the Map tab, and there you go! You just have to drop your photos onto the location where they were taken or that you want to geotag. You can find the location easily by using the search box and zooming in and out accordingly. All this shouldn't take more than 15 seconds (of course, not on dial-up :P). If you want to learn more about this, check out the screencasts and the blog post at Flickr.

      So this service opens up many new possibilities. One thing I'm looking forward to seeing is an integration between this and Google Earth. Any other cool ideas?

      ]]>
      They know where you've been...... http://laktek.com/2006/09/01/they-know-where-youve-been http://laktek.com/2006/09/01/they-know-where-youve-been/#comments Thu, 31 Aug 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/09/01/they-know-where-youve-been I stumbled on this critical exploit (or rather, smart technique) while visiting "How Web 2.0-Aware are you??". They calculated a percentage based on the sites I had visited out of the 42 sites on their list. So how did they know I had visited these sites? Yeah, they sniff my browser history!!

      Actually, this is just one lame example of a practical use of this new way of tracking users' browsing habits, which came into play a few days ago with Jeremiah Grossman's blog post. However, this bug was reported in Mozilla over four years ago and sat there without catching anyone's eye.

      So how do they do it? This hack is a combination of CSS and JavaScript. Remember the CSS pseudo-class called a:visited, which makes visited links appear in a different colour (or style)? They use JavaScript to walk through a set of hyperlinks and check whether each link's style matches the a:visited style. Keeping a large list of links hidden from the user's screen makes this practical. Using this technique, a site could check whether you've arrived after visiting their potential competitors and use that to provide a better surfing experience. Some sources say this technique is already used by many e-commerce sites. But what if it falls into evil hands? Online blackmailing??
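
      To make the trick concrete, here is a minimal sketch of the idea in TypeScript (browser code). The list of URLs and the probe colour are my own illustrative choices, and note that modern browsers now restrict what getComputedStyle reports for :visited links, so this no longer works as-is; it did at the time this post was written.

        // Minimal sketch of :visited history sniffing, for illustration only.
        // Assumes it runs inside a browser page; the URLs below are arbitrary examples.
        const candidateUrls: string[] = [
          "http://www.flickr.com/",
          "http://del.icio.us/",
          "http://basecamphq.com/",
        ];

        // Give visited probe links a distinctive colour via a stylesheet rule.
        const style = document.createElement("style");
        style.textContent = "a.probe:visited { color: rgb(255, 0, 0); }";
        document.head.appendChild(style);
        const VISITED_COLOR = "rgb(255, 0, 0)";

        function sniffHistory(urls: string[]): string[] {
          const visited: string[] = [];
          for (const url of urls) {
            const link = document.createElement("a");
            link.className = "probe";
            link.href = url;
            link.textContent = url;
            link.style.position = "absolute";
            link.style.left = "-9999px"; // keep the probe links off-screen
            document.body.appendChild(link);

            // If the browser applied the :visited rule, the URL is in the history.
            if (window.getComputedStyle(link).color === VISITED_COLOR) {
              visited.push(url);
            }
            document.body.removeChild(link);
          }
          return visited;
        }

        console.log(sniffHistory(candidateUrls));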

      So how can you get around this? There are several ways. One easy way is to set the browser's history retention to 0 days. You could also disable JavaScript in the browser, which will stop such scripts from executing. Meanwhile, I found two Firefox extensions that could be used to fix this issue.

      P.S. - It seems the Opera browser is not affected by this exploit.

      ]]>
      Will Google come to Sri Lanka ? http://laktek.com/2006/09/21/will-google-come-to-sri-lanka http://laktek.com/2006/09/21/will-google-come-to-sri-lanka/#comments Wed, 20 Sep 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/09/21/will-google-come-to-sri-lanka Today I came across this Google job openings page which lists four openings in South Asia, including a position for a Sri Lanka Country Consultant. For these positions Google doesn't mention an exact location; it only says "This position is located in South Asia." This sounds like Google is going to open a new South Asian regional office apart from the ones they have in India. What is more exciting is that a country consultant position has been opened only for Sri Lanka (not for Pakistan or Bangladesh). Does this signal the possibility of seeing a Google office in Sri Lanka?

      Sri Lanka has been more noticeable to Google than any other South Asian country in the recent past, thanks to events like Google SoC and ApacheCon. Who knows, they may see Sri Lanka as a potential market? There are already local SMEs using Google AdWords, and more importantly, Google may see a huge market in developing countries like Sri Lanka for their future products such as Web Office and Google OS.

      So how important is it for us to have Google here? There would be many benefits, such as infrastructure development and recognition. Also, Google is a major supporter of the FOSS community, and FOSS will be key for countries like ours in the coming future. So having a Google presence would be a catalyst for the local FOSS community as well. Let's hope Google opens an office in Sri Lanka and many Sri Lankan engineers get to work there...

      ]]>
      Hacking made easy with Google Code Search http://laktek.com/2006/10/10/hacking-made-easy-with-google-code-search http://laktek.com/2006/10/10/hacking-made-easy-with-google-code-search/#comments Mon, 09 Oct 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/10/10/hacking-made-easy-with-google-code-search I was bored while studying for the exams and just hit on the new Google Code Search. Oh! Do you know what I was able to extract with a few keyword combinations? A whole bunch of FTP and database connection credentials for sites which are currently live on the net. Those credentials seem to be valid; I didn't want to mess with those sites, so I left them as they were. It's damn simple, and even a three-year-old could be a hacker with this.

      Anyway folks, better keep your sites secure from the Google Code spider. From what I read, it seems to index all files, including files inside compressed archives found in public directories. So don't leave your important code there.

      ]]>
      What was the first web site you visited ? http://laktek.com/2006/10/29/what-was-the-first-web-site-you-visited http://laktek.com/2006/10/29/what-was-the-first-web-site-you-visited/#comments Sat, 28 Oct 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/10/29/what-was-the-first-web-site-you-visited Inspired by Sam's post, I also tried to recall my early internet experiences. The first web site I visited was www.cricket.org; I still visit it daily, and it's now better known as cricinfo.com. Connecting to the internet with a Lanka Internet prepaid card and a 28k modem seems funny in the current context, but that little experience led me to a whole new world of opportunities.

      Anyway, can you remember the first site you visited?

      ]]>
      Semantics on Word Processing http://laktek.com/2006/11/05/semantics-on-word-processing http://laktek.com/2006/11/05/semantics-on-word-processing/#comments Sat, 04 Nov 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/11/05/semantics-on-word-processing As an adamant fan of web standards and semantically correct web design, I always tried to avoid using tables in web page layout design. But I never thought of taking that practice beyond web design.

      However, today I got a call from a less tech-savvy person who wanted me to help him with the layout of a Word document (yes, it was M$ Word). He wanted a two-column layout on a landscape A4 page. Even though I hate such shitty word processing jobs, I didn't want to disappoint him. So I went over to help him.

      I'm not good at word processing, and I'm not comfortable with MS Word either. I normally put my trust in OpenOffice for my small word processing needs. So I had to work on the matter with the little experience I had. At that moment, tables looked like the only solution. I knew it wasn't the right way, but my vanity kept me from exploring all the menu options and consulting the help. The end result was a waste of time for a less usable layout. Anyway, he was happy with it, but I know it will last only until he wants to amend the content of a cell.

      After I returned home I took some time and found that column layouts are one of the built-in features of the word processor. It was just a matter of clicking a button and selecting the right layout. I feel like I should have followed one of those boring courses on basic office packages :) But I don't think Sri Lankan institutes even do that right; what they teach is Microsoft Office. They think word processing can only be done in MS Word and spreadsheets only in MS Excel (how many of you know about Google Office?)

      So what did I learn from this? I think semantic design should be practised everywhere it's possible. Use tables only when you want to display tabular data. I have also seen a lot of people use spaces and tabs to control indentation, but for that too there are suitable tools in a word processor.

      Practice semantic design in your word processing, it will save you the hassle of fixing the layout every time you make a change to the document.

      ]]>
      Gmail realigns its interface http://laktek.com/2006/11/10/gmail-realigns-its-interface http://laktek.com/2006/11/10/gmail-realigns-its-interface/#comments Thu, 09 Nov 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/11/10/gmail-realigns-its-interface It seems Gmail has also joined the rounded-corner and drop-shadow galore with its interface upgrade. Even though there isn't a major upgrade to the main UI, the message interface has become much sweeter.

      They have also moved all the message actions to a top-right drop-down menu, which is actually more irritating than what they had earlier. However, it seems Google has some bigger upgrades to Gmail coming. This may be just the tip of an iceberg.

      ]]>
      The tools I use http://laktek.com/2006/12/15/the-tools-i-use http://laktek.com/2006/12/15/the-tools-i-use/#comments Thu, 14 Dec 2006 16:00:00 GMT Lakshan Perera http://laktek.com/2006/12/15/the-tools-i-use Here is the list of tools I use for my day to day activities.

      • Machine : Compaq Presario v3000
      • OS : Ubuntu 6.10 (Primary OS), Windows XP Media Center (OEM version, keeping it for testing)
      • Editor : Scribes (It's becoming the TextMate for GNU! An awesome editor)
      • Browser : Firefox 2, Opera
      • CLI : Bash
      • Web Host : Dreamhost
      • Framework : Ruby on Rails
      • CMS/ Blog : Wordpress, Drupal
      • Javascript Library : Prototype
      • Debugger : Firebug (yes, the new beta rocks !)
      • Project Management : Active Collab
      • FTP Client : FileZilla
      • Version Control : Subversion
      • Office Suite : OpenOffice
      • Image Editing : Photoshop CS2 (Main reason I cannot get rid of my windows partition), Google Picasa, GIMP
      • Email Client : Gmail (Including my POP3 accounts)
      • IM : via GAIM (IRC, Google Talk and Yahoo)
      • Distance Communication : Skype
      • Music Player/Manager : Amarok
      • Torrent Client : uTorrent via Wine (I haven't seen a better torrent client than that)
      ]]>
      Keeping the focus http://laktek.com/2007/01/13/keeping-the-focus http://laktek.com/2007/01/13/keeping-the-focus/#comments Fri, 12 Jan 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/01/13/keeping-the-focus For the first time this year, I'm going to touch my blog. Since I started this blog back last May, I have got great exposure by coming to know various cool people and getting involved in great projects. But when I was swamped with other work, the blog was simply neglected. Actually, I think at times I filled the blog with more noise than signal. This year I'm going to give more focus to my blog, and I hope the blog will also mature with experience.

      2006 was a great year for me personally. I got into university for my higher studies and was also able to make steady progress in my professional career as a web developer. I got the opportunity to work on some great freelance projects, which I really enjoyed. Besides that, I had the opportunity to join Vesess as one of their developers. Vesess has the spirit of a new-age web startup, and it seems to be a place where I can really advance my career.

      So the challenge in 2007 is to move up the ladder while keeping up the same momentum. 2006 was a year of establishment for me; I have fulfilled my needs, and the next step is to keep up the commitment. That will be much harder, but as always, I'm ready for the challenge.

      ]]>
      LakTEK is joined with Blue Fish Network http://laktek.com/2007/01/21/laktek-is-joined-with-blue-fish-network http://laktek.com/2007/01/21/laktek-is-joined-with-blue-fish-network/#comments Sat, 20 Jan 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/01/21/laktek-is-joined-with-blue-fish-network One of my new year resolutions was to give more attention to my blog and serve it to a much bigger audience. Today I was notified by the Blue Fish Network that they have added my blog to their network. I see this as yet another opportunity to bring my blog to a wider audience.

      Blue Fish is a UK-based global blog network which currently includes 67 members, who muse on a wide array of subjects. I've been included in the Technology stream along with other cool folks like Nick Barrett, Kevin Sylvia and Francisco.

      So I take this opportunity to thank Andy Merrett and the rest of the Blue Fish Network for accepting me into the community. I'm hoping to build some great relationships through this opportunity.

      ]]>
      Early Bird view to Google Summer of Code Project Ideas http://laktek.com/2007/03/08/early-bird-view-to-google-summer-of-code-project-ideas http://laktek.com/2007/03/08/early-bird-view-to-google-summer-of-code-project-ideas/#comments Wed, 07 Mar 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/03/08/early-bird-view-to-google-summer-of-code-project-ideas As most of you may be aware (or not), Google Summer of Code 2007 has been announced. This year I'm eager to apply for a project, even though I'm not sure whether I will get accepted. The project ideas of accepted mentoring organizations will be released only on the 14th of March, and the student application period ends on the 24th of March. Since I (and many others) haven't been actively involved in projects with these organizations, I thought it better to begin preparations early and pick a good project. Most of the organizations have put up a wiki page of potential projects.

      I thought of sharing the list of project ideas I compiled. Note that the final project ideas of these organizations may differ, and not all of these organizations may get accepted to GSoC.

      UPDATE : A lot of new project ideas which I had missed have been added in the comments section. Check them as well.

      ]]>
      Accepted for Google Summer of Code ! http://laktek.com/2007/04/12/accepted-to-google-summer-of-code http://laktek.com/2007/04/12/accepted-to-google-summer-of-code/#comments Wed, 11 Apr 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/04/12/accepted-to-google-summer-of-code Today is one of the greatest days of my life, as I got accepted to Google Summer of Code (GSoC) 2007. It's a dream come true for me, and I'm looking forward to some great experiences with it.

      Google Summer of Code is a program organized by Google for university students, where they work for two months on an open source project, mentored by an organization. It gives a huge boost to the open source community as well as splendid experience for budding developers. This year 900 students were selected worldwide out of 6,200 applications, to work with 100 mentoring organizations.

      I will be working on the SilverStripe CMS and Framework. SilverStripe has been a great project, and it's really fun to work with such a community. SilverStripe is basically like a cross between Drupal and Ruby on Rails (RoR), which makes it an easy platform for developing any kind of site. It's a delight to work with the stuff I love most, such as MVC, AJAX, web standards, etc. I hope I can contribute to the success of SilverStripe and help make it a perfect framework for web development. I will be updating this blog with further technical details of my project, and I will share the experiences I gain.

      It's also a great delight to see my fellow Vesessins, Laknath and Amila, getting accepted as well. The inspiration from Prabhath's experience with GSoC last year also helped us immensely. Now we have four of our team with GSoC experience, which gives the world a clear indication of the quality of Vesessins. I wish good luck to my buddies too! Looking forward to exciting and challenging days ahead :)

      ]]>
      One step away from the ultimate title http://laktek.com/2007/04/25/one-step-away-from-the-ulitmate-title http://laktek.com/2007/04/25/one-step-away-from-the-ulitmate-title/#comments Tue, 24 Apr 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/04/25/one-step-away-from-the-ulitmate-title Whooo! They have done it again! Mahela and crew are one step away from bringing glory back to this small island in the Indian Ocean. Congrats guys! It's a marvelous display of sportsmanship from our beloved cricketers!

      Cricket is more than just a sport in Sri Lanka; it's a passion. The whole nation has gone crazy with our superb performances in this World Cup. Sri Lankans have simply begun to eat cricket, drink cricket and live cricket!!! Everyone will be praying to see a repeat of the 1996 performance (or even better) on the 28th of April.

      I was in 5th grade when Arjuna lifted the cup at Gaddafi Stadium, Lahore. 11 years have gone by since that glorious day, and it seems my enthusiasm for the game has never changed. Actually, it has got better, like Sanath's batting :) Even though I'm just a TV spectator, I dance in ecstasy and weep in disappointment (like yesterday, when Sanath threw his wicket away) as if I were there in the middle with them. I think it's not only me; all Sri Lankan fans act the same way. It's now 6.30 a.m. on a normal working day here in Sri Lanka, and I still don't hear a single vehicle on the road (not even the noise of trains). It seems the whole nation is still relaxing and enjoying last night's great victory (the match ended around 3.30 a.m.), and today will kick off rather late in Sri Lanka :)

      Also, talking about our team, I think the current side is a much stronger and more balanced outfit than what we had at the 1996 World Cup. Sanath, Murali and Vaas have grown with experience, and the die-hard efforts of young blood such as Tharanga and Malinga are quite impressive. I also haven't seen as aggressive and tactical a captain as Mahela in the game. We truly play our own brand of cricket right now, and I guess that will be the difference at the end of the day.

      A little trivia to end this post: Marvan Atapattu was a member of both Sri Lankan squads, in 1996 and this time, yet he hasn't had the opportunity to play a single match in either World Cup (and is not likely to play in the final either). I know it's always nice to be part of a winning outfit, but I wonder how Atapattu must be feeling...

      ]]>
      City under attack ? http://laktek.com/2007/04/29/city-under-attack http://laktek.com/2007/04/29/city-under-attack/#comments Sat, 28 Apr 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/04/29/city-under-attack An update on the current situation in Colombo (to be exact, here in Wattala). Nothing is confirmed yet, and the past few hours have been filled with fear and uncertainty.

      We got electricity back a few minutes ago (around 3.50 am) and now the whole city seems calm and quiet. There is still no official update on what happened a few hours ago, but it seems something serious took place over the skies of the city.

      We were all glued to the television, intensely watching the Sri Lankan run chase against the Aussies in the World Cup final. Suddenly, around 1.10 am, there was a power failure. Blaming the electricity board for poor load balancing, my brother and I went to our rooms, expecting the electricity to come back in a few minutes.

      Just then we heard the sound of an aeroplane going along the seaside, and we didn't take it very seriously at the time. But my brother and I did talk about the possibility of an air raid over the country at this moment, when everyone was concentrating on nothing but cricket.

      Suddenly, around 1.30 am, we heard a noise similar to firecrackers, and then suspicious loud blasts which were definitely not firecrackers. I called one of my uncles (they had electricity in their area) and he said it must have been just firecrackers, as Sanath had hit 3 successive boundaries at that very moment. We thought the same, as the whole city was enjoying and partying. Meanwhile we were trying to contact the electricity board to ask what went wrong with our power line.

      Around 2.00 am, another uncle from Kandy called and told us the news of an air raid over the city. A few phone calls afterwards confirmed the story. At this time there were a lot of rumors flowing about attacks on the airport, the harbour and the Kelanitissa power plant, and it was also clear that about 80% of the city was in the dark at that moment. When I checked the official web site of the Ministry of Defence and TamilNet over GPRS, there was news of a possible air strike on Colombo.

      Around 3.00 am, we again heard the sound of an aircraft flying over, and what we saw were tracer bullets being fired at it. We clearly heard the noise of gunfire.

      As I'm writing this, I heard some firing again and I'm not sure what it's all about. Now it's quiet again, and it's 4.37 am in Colombo.

      UPDATE : Official reports now confirm the bombing of the oil power plants in Kerawalapitiya and Kolonnawa. The Kerawalapitiya plant is in very close proximity to our house in Wattala.

      ]]>
      Confusing Web 2.0 Domains http://laktek.com/2007/05/14/confusing-web-20-domains http://laktek.com/2007/05/14/confusing-web-20-domains/#comments Sun, 13 May 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/05/14/confusing-web-20-domains How do you define a web 2.0 application? The usual distinguishing features include AJAX, rounded corners, large input fields and badges. But did you notice another feature which is common to most apps of the web 2.0 culture? Most of them have confusing, difficult-to-spell domain addresses or URLs.

      It's a PITA when introducing these apps to my non-geeky friends. Here is how a typical conversation goes.

      me : Do you know that cool project management tool called basecamp ?

      buddy : Nope. What is the site basecamp.com ?

      me : no pal u hav to add hq to end it's basecamphq.com

      buddy : or maybe I'll just visit the company site and use the product from there, that seems much easier.

      (types the URL)

      buddy : There is no site called ThirtySevenSignals.com ???

      me : no no mate it's 37 numeric 3 and 7!

      buddy : Oh! this makes me sick. Show me something not that geeky in web 2.0 world...

      me : ah! you gotta visit flickr that's awesome photo sharing service :)

      buddy : What ??? seems like they are out of the business that domain is for sale

      me: Are u gone crazy ?? I'm using it now

      buddy : I'm not it seems you are. You said F-L-I-C-K-E-R.com right Flicker.com ??

      me : nope buddy it's F-L-I-C-K-R not E-R ..

      buddy : how can I remember all these weird names ?

      me : thats why you should use a bookmarking service to save ur brain from interpreting these URLs

      buddy : tell me such a site

      me : Delicious

      buddy : the site may be delicious but how do I get to it ?

      me : Just type DEL dot ICIO dot US

      buddy : what DEL ? you mean the DEL key on the keyboard ??

      me : Just GOOGLE for the site

      buddy : excuse me how many O's in between the 2 G's ?

      me :

      ]]>
      .app TLD for Web Applications http://laktek.com/2007/05/18/app-tld-for-web-applications http://laktek.com/2007/05/18/app-tld-for-web-applications/#comments Thu, 17 May 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/05/18/app-tld-for-web-applications Web applications have become the next trend on the internet, and it seems the day they'll replace traditional desktop apps is not far off. But web applications still aren't the easiest thing for average users, and one of the major barriers is accessing the application itself. As I mentioned in my previous post, most web applications have confusing domain names.

      This is mainly due to the unavailability of top-level domain extensions for simpler and more common terms. So how about introducing a separate TLD for web applications? I feel .app would be the most suitable extension. It would make web applications stand out from information-only web sites. This would be a similar concept to the .mobi extension, which was intended to serve only mobile content.

      What do you think about this idea ?

      UPDATE

      Flashback to 2011: this idea is on the verge of becoming a reality. Since ICANN opened the doors for the registration of custom gTLDs, there are several non-commercial initiatives, such as http://dotappapp.com/, trying to secure the rights to the .app gTLD. They are currently in the phase of raising the necessary funds for ICANN consideration.

      ]]>
      Daily reads for wannabe WebDs http://laktek.com/2007/05/22/daily-reads-for-wannabe-webds http://laktek.com/2007/05/22/daily-reads-for-wannabe-webds/#comments Mon, 21 May 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/05/22/daily-reads-for-wannabe-webds This is a collection of sites which I recommend as daily reading for people who are eager to pimp their web design/development skills. It was initially prepared to send to my fellow batch mates at the university, but later I thought, why not share it with the rest of the world too :) These resources may be nothing new for seasoned players in the game, but I hope noobs will have something to reap.

      Have any other recommendations on your personal experience ?

      ]]>
      Let's bid farewell for Prof.V.K.Samaranayake http://laktek.com/2007/06/08/lets-bid-farewell-for-profvksamaranayake http://laktek.com/2007/06/08/lets-bid-farewell-for-profvksamaranayake/#comments Thu, 07 Jun 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/06/08/lets-bid-farewell-for-profvksamaranayake Today is a day of mourning for the Sri Lankan IT industry, with the demise of Sri Lanka's "Father of IT", Prof. V.K. Samaranayake. Many will unanimously agree that the Sri Lankan IT industry's successes today are largely due to the vision of Prof. Samaranayake.

      Beginning with CINTEC and later through ICTA, the work he did for the development of the IT industry in this country will always be remembered. The introduction of the BIT external degree program, which enabled many students to get into the IT field; fueling the FOSS community by helping to launch the LSF; Sinhala language support in Windows; and the e-government project, which digitized the procedures of many government institutions, are some of the notable projects which took place under Prof. Samaranayake.

      The vacuum left by his passing will never be filled, and the whole country will miss him in the years to come. I offer my sincere condolences to the Professor's family. May he rest in peace!

      ]]>
      Enjoying my Summer Of Code with SilverStripe http://laktek.com/2007/07/11/enjoying-my-summer-of-code-with-silverstripe http://laktek.com/2007/07/11/enjoying-my-summer-of-code-with-silverstripe/#comments Tue, 10 Jul 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/07/11/enjoying-my-summer-of-code-with-silverstripe First of all, I beg your pardon for not updating you on my Summer of Code experiences as I promised. Once I get busy with coding, I tend to forget all the other essential stuff that needs to go along with hacking. Yeah, I know that's not a good practice for a FOSS developer, and I'm trying to get rid of it :)

      Anyway, I've had a very productive first half with the SilverStripe CMS, and it's a joy to see SilverStripe has already released parts of my project as modules. It's also been fun to work with such a courteous and like-minded bunch of developers from all parts of the globe. Compared with other FOSS organizations, SilverStripe is a very young and trendy organization which could make a big impact on the FOSS world in the years to come. Thumbs up to Google for supporting such a budding organization through GSoC; if not for GSoC, I would never have known a project called SilverStripe CMS & Framework existed.

      So you'd be eager to know what I did with my project in the last one and a half months... If you recall, the main objective was to provide mashup capabilities to the SilverStripe CMS & Framework. I decided to achieve this by providing connections to web services via a RESTful interface. I prefer REST over RPC, and I hope to talk about that in more depth in a separate post.
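
      In case it helps to picture what such a connector does, here is a rough, purely illustrative sketch in TypeScript of the general pattern: compose a query-string request, fetch it, and pull values out of the XML response. The endpoint, parameter names and tag names below are made up for the example; this is not the actual SilverStripe or Flickr API.

        // Illustrative sketch of a generic RESTful connector (not SilverStripe code).
        async function restfulRequest(
          baseUrl: string,
          params: Record<string, string>
        ): Promise<Document> {
          const query = new URLSearchParams(params).toString();
          const response = await fetch(`${baseUrl}?${query}`);
          const xml = await response.text();
          // Most Web 2.0 REST APIs of this era returned XML, so parse it as such.
          return new DOMParser().parseFromString(xml, "application/xml");
        }

        // Hypothetical usage: list photo titles from an imaginary photo service.
        async function listPhotoTitles(): Promise<string[]> {
          const doc = await restfulRequest("https://api.example.com/rest", {
            method: "photos.search",
            tags: "sri-lanka",
          });
          return Array.from(doc.querySelectorAll("photo")).map(
            (node) => node.getAttribute("title") ?? ""
          );
        }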

      As a summary, these are the tasks I completed during the period :

      1. Created the RestfulServices class, which can be extended to support any RESTful web service API. (more info)
      2. Built the FlickrService API by extending RestfulServices. (more info)
      3. Created the FlickrGallery controller, a page type users can add to their SilverStripe CMS to display a set of Flickr photos as a gallery. (see the demo)

        The interesting thing about this is that users can add this functionality entirely through the WYSIWYG editor, without having to write a single line of code (how does that sound? :) )

      One of the complaints I got from friends when I suggested the SilverStripe CMS to them was the lack of ready-made themes for it. Yeah, you get the default yet spiffy BlackCandy, but people just love to see more options. So, to fill that vacuum to some extent, I created the PaddyGreen theme, which I will release in the next few days. I wanted to make it trendy, simple and customizable, but there is still room for improvement. This is not part of my GSoC proposal; it's just a pet project.

      What I've mentioned is just the tip of the iceberg; there is even more cool stuff to be released in the next few months, from me and my fellow GSoCers. So if you are in search of a user-friendly CMS or framework for your next project, consider SilverStripe as well :)

      ]]>
      Refactoring Catalog http://laktek.com/2007/07/17/refactoring-catalog http://laktek.com/2007/07/17/refactoring-catalog/#comments Mon, 16 Jul 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/07/17/refactoring-catalog Refactoring holds the same level of importance as coding the behaviour. I enjoy refactoring my code, as it leads to much cleaner code and also teaches you how to do something better. But identifying what and where to refactor is something a programmer gains through practice.

      Recently I found a great catalog of commonly used refactorings, compiled by Martin Fowler.

      Visit or bookmark the link - http://refactoring.com/catalog/index.html
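
      To make the idea concrete, here is a tiny before/after sketch (in TypeScript, with names I've made up purely for illustration) of one refactoring from the catalog, Extract Method:

        // Before: the total calculation is tangled into the printing function.
        function printInvoice(customer: string, amounts: number[]): void {
          console.log(`Invoice for ${customer}`);
          let total = 0;
          for (const amount of amounts) {
            total += amount;
          }
          console.log(`Total: ${total}`);
        }

        // After "Extract Method": the calculation gets a name of its own, so
        // printInvoice reads like a summary and other code can reuse the helper.
        function calculateTotal(amounts: number[]): number {
          return amounts.reduce((sum, amount) => sum + amount, 0);
        }

        function printInvoiceRefactored(customer: string, amounts: number[]): void {
          console.log(`Invoice for ${customer}`);
          console.log(`Total: ${calculateTotal(amounts)}`);
        }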

      ]]>
      Love Puzzles ? Try these... http://laktek.com/2007/07/30/love-puzzles-try-these http://laktek.com/2007/07/30/love-puzzles-try-these/#comments Sun, 29 Jul 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/07/30/love-puzzles-try-these Warning : These may be quite addictive and will require you to burn a lot of brainpower :)

      Da Vinci's Other Code - http://www.coudal.com/davinci.php
      School of Government - http://www.coudal.com/theotherfish.php
      Einstein's Fish Puzzle - http://www.coudal.com/thefish.php

      AND

      Which Porn Star Ate the Most Hot Dogs? - http://www.coudal.com/hotdog.php

      (via Coudal Partners)

Silverstripe at Google Tech Talks http://laktek.com/2007/08/03/silverstripe-at-google-tech-talks http://laktek.com/2007/08/03/silverstripe-at-google-tech-talks/#comments Thu, 02 Aug 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/08/03/silverstripe-at-google-tech-talks The founders of Silverstripe, Sigurd Magnusson and Sam Minnee, were recently invited to do a Tech Talk at Google. Watch their talk about New Zealand, starring in Lord of the Rings, how they destroyed the ring, and Summer of Code.

SHAMELESS NOTE: Watch around the 27th minute to see Sigurd presenting the work I did for GSOC.

(You can download the full video from Google Video)

JAVA to be traded in the Stock Market?? http://laktek.com/2007/08/24/java-to-traded-in-stock-market http://laktek.com/2007/08/24/java-to-traded-in-stock-market/#comments Thu, 23 Aug 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/08/24/java-to-traded-in-stock-market Jonathan Schwartz has announced that Sun Microsystems is going to change their stock trading symbol from SUNW to JAVA. This move implies that the Java technology itself is being traded, and people could be fooled into believing they own shares of the Java technology rather than Sun Microsystems. One wonders whether this was the real motive behind releasing Java as an open source technology: spread the technology everywhere and use its popularity to raise the organization's stock value.

Sweet Summer of Code, Thank you Google... http://laktek.com/2007/08/25/sweet-summer-of-code-thank-you-google http://laktek.com/2007/08/25/sweet-summer-of-code-thank-you-google/#comments Fri, 24 Aug 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/08/25/sweet-summer-of-code-thank-you-google GSOC 2007 finally came to an end on the 20th, giving us loads of experience and lots of other good things to take away. It helps create a passion for open source software among students by providing an opportunity to work with more than 100 OSS projects during their summer holidays. Even though it is framed as a summer vacation program, any university student around the globe can participate. So students like me who don't actually have such a summer vacation (in Sri Lanka we don't experience seasons, thus no summer holidays) are also eligible. The best thing for some is the $4,500 stipend Google pays for the work done in the program. Honestly, it's a great relief for our academic expenses for the year, but I believe the experience and reputation the program brings are more important in the long term.

Talking about reputation, it's great to be in a community of different people from all around the world (almost every continent) who know you by your first name. This was mainly because Silverstripe was a small, tightly knit community, still a budding project. I guess I made the right move by applying to an upcoming project like it rather than a big project like GNOME, Apache or Drupal, where I would have been just another contributor among many. Silverstripe valued our work very much and put a lot of effort into giving us exposure. The most wonderful thing was our work being featured in the tech talk Silverstripe presented at Google Headquarters (thanks Sig!)

I hadn't contributed much to FOSS prior to GSOC, but now I feel that, as a developer, it's essential to work on a FOSS project in your spare time. So I'll be continuing my work with Silverstripe CMS (and even with some other projects if I get more time). Not only does it bring you credibility and reputation, it also lets you polish your skills by learning from the masters.

Here are some lessons I learnt from GSOC; I hope they help you as well.

      Polishing your coding skills

No programming course can teach the art of programming or best practices; those only come with experience. It may be as simple as the placement of a parenthesis or line indentation, but it's better to do it the correct way. In open source projects, where documentation is often mediocre, it's the code that acts as the documentation or the spec.

At Silverstripe there was a strong concern for coding standards and best practices. Most of what I learnt about the project was by looking at the code itself. Similarly, many will try to understand how my module works just by looking at its source code, so a lot of careful thought was needed while coding. The seasoned hackers of the core team (especially my mentor, Matt) inspired me to do things the right way.

Now I treat coding as a creative exercise, like a painter working on a canvas. It's not just copy-pasting and changing some stuff. This is something I would never have learnt on my own, even if I did tons of freelance projects.

      How to set achievable goals

GSoC is a program with a limited time frame (as is any other project). Even though we may come up with a mind-blowing proposal, due to practical constraints it's difficult to achieve all of it within the period. So it's essential to break the project into several chunks and deliver them in several iterations. This lets you focus on a single small target rather than a large, wayward one.

      Working under pressure

One of the major challenges I faced during GSOC was balancing academic work and project work. As I mentioned earlier, in Sri Lanka we don't have summer holidays, so GSOC ran alongside the normal semester. I couldn't make this an excuse, since I had promised to complete the proposed work in that time frame, yet I didn't want to throw away this great opportunity that came my way. In the end, I felt I kept the balance and was able to complete almost every assigned task by the deadline.

BTW, I learnt that I must be willing to take such risks if I want to go beyond average.

      Communication and Collaboration

Working with a team of developers who live across the globe, in different timezones, was a new experience for me. A lot of decisions were taken through communication over IRC, forums and Skype.

Collaboration in development was mainly done using a source code management system (Subversion) and Trac. Silverstripe offered a special branch in the repository for GSOC, which enabled us to commit our patches freely without worrying much about conflicts with the trunk. They also created separate areas in the SCM for the modules we developed and gave us total control over them.

      Supporting the Community

Apart from development, a developer in an open source project must be willing to help the community use the product. Responding to questions from the forum or IRC was also part and parcel of the project assignment. I also documented the usage of the modules I developed as much as possible. As I understand it, the support developers render to users is one of the key factors in the success of an open source project; if more support is available, more people are willing to try out FOSS products.

Lastly, I must not forget the surprise gift Google sent us. Producing Open Source Software by Karl Fogel acted as a bible for me in understanding the concepts and attitudes an open source developer needs to have.

The Renaissance http://laktek.com/2007/11/02/the-renaissance http://laktek.com/2007/11/02/the-renaissance/#comments Thu, 01 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/02/the-renaissance It's been more than two months since the last post on this blog, and you may have thought this was another blog headed for the deadpool. If so, you're wrong! Since I started this blog 1 1/2 years ago, it has been the platform for airing my thoughts and it has brought me many opportunities. The temporary silence was just a retreat to come back more strongly equipped. So here I am again, taking my next round in the blogosphere with a completely redesigned site.

Take a look around the site and drop your comments about the new blog. Watch for the next post for a complete walkthrough of this redesign.

Walkthrough of the Redesign http://laktek.com/2007/11/03/walkthrough-of-the-redesign http://laktek.com/2007/11/03/walkthrough-of-the-redesign/#comments Fri, 02 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/03/walkthrough-of-the-redesign When I first launched my blog, the main intention was to make my mark online. Rather than sticking with Blogspot (I did have a blog there for a short stint) or wordpress.com, I decided to go one step further and host the blog on my own domain with a customized template. That was a good move at the time, and it caught the eyes of several people. The blog got me into networks, and I had a niche readership composed mainly of local geeks. That was fine for a start.

Later, when other engagements became priorities, the focus on the blog diminished as usual. The readership also stopped expanding, and things stalled for a while. Some of my friends criticized the blog for being stagnant, and from them I learnt that its presentation didn't appear attractive enough for casual readers. In conclusion, I felt this blog needed an overhaul in terms of both content and layout. Hence, the redesign.

      It took me about one week to finish the whole thing and I'm constantly tweaking it to look better.

      Site Structure

I felt that having too many posts on the homepage would make it difficult for a reader to grasp the content. I decided to go with a tabloid style, giving more focus to the latest article and placing the next 3 most recent posts at a secondary level. Hopefully this will give better attention to each individual post.

Apart from my original musings, I've added a separate section called 'masterpieces' to share the great stuff I find on the net. The five most recent masterpieces are displayed on the homepage. I recommend subscribing to its RSS feed to catch every beat (http://feeds.feedburner.com/laktek/masterpieces)

Another new addition in this redesign is tagging for posts. I think the tag cloud in the footer will give a first-time visitor a clear idea of the flavor of the blog.

      Design

This is my first site designed on a grid-based layout. It was a bit of an experiment, and I really liked the concept. It was the Blueprint CSS framework that made the process so easy. I went with the default 24-column grid, and it was very convenient for positioning elements. Blueprint also has nice built-in reset and typography styles, which took a load of CSS hackery off my hands. Oh! I should not forget the print stylesheet that comes with it, which makes the site look smart in print without any effort.

As a developer, I really love Blueprint CSS. It made my life easy by taking care of all the stuff I hate doing with CSS, which allowed me to focus more on the look and feel. My design buddies had a slightly different take on the concept; they felt it was somewhat bloated and too heavy. But for an occasional designer with a developer's attitude, Blueprint is the answer! (Remember how much developers hate the structure of CSS code itself.) Hats off to Olav Bjorkoy for creating such an awesome framework.

      The Platform

Though I was looking at alternatives to replace WordPress as the platform, I finally decided to stick with WP, mainly for its stability as a blogging system. This site now runs on WP 2.3.1, and WP is mature enough to cover all the basic needs of a blog, from tagging to handling comment spam. Also, thanks to the large community around WP, you never run short of plugins. If you wish for some feature, someone else has already built it and has been generous enough to release it as a plugin.

Anyway, my only issue with WP is the templating system. That may be because I've been working a lot with MVC in the recent past. A feature of MVC-based systems such as Silverstripe and Ruby on Rails is their clean templates; the clear separation of functionality from the view makes life easy.

That's basically how things went, and I'm fairly happy with the outcome. Anyway, its success ultimately depends on how you, the readers, respond and how useful you find this place. That's the essence of user-centered design. So drop your comments; they will really help me make this better.

The Linux Desktop http://laktek.com/2007/11/09/the-most-fancy-desktop-os http://laktek.com/2007/11/09/the-most-fancy-desktop-os/#comments Thu, 08 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/09/the-most-fancy-desktop-os Last night I finally decided to upgrade my machine to Ubuntu 7.10 (Gutsy Gibbon). It's been almost a month since the release, and I was a bit reluctant to upgrade early. I'd been busy with development work lately and my machine was stable on Feisty Fawn (7.04), so I didn't want to take the gamble of upgrading early. Also, I experienced a few hiccups during the last upgrade, with high demand hitting the Ubuntu servers and broken repos. This time the upgrade went smoothly, and this seems to be the most stable Ubuntu version ever!

With the new version of the Linux kernel included in Gutsy, it seems the long-standing issues I had with the graphics and sound cards are over. Sound now works normally when resuming from hibernation, and the default speakers are muted when I plug in the headset. These two trivial issues had been bugging me in the past, and I never found a workaround :) What I love most is Compiz Fusion working out of the box, without needing any extra configuration effort.

Twitter Widget for SilverStripe CMS http://laktek.com/2007/11/15/twitter-widget-for-silverstripe-cms http://laktek.com/2007/11/15/twitter-widget-for-silverstripe-cms/#comments Wed, 14 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/15/twitter-widget-for-silverstripe-cms I released a new widget for SilverStripe CMS, which allows you to show your Twitter status on your SilverStripe-based blogs. More Info and Instructions for usage.

There will be more interesting stuff coming up for SilverStripe along with the 2.2 release. With the new version, SilverStripe further simplifies building and managing web sites. So try out SilverStripe for your next web site and feel the difference.

My contributions released officially ! http://laktek.com/2007/11/29/all-my-gsoc-contributions-released-officially http://laktek.com/2007/11/29/all-my-gsoc-contributions-released-officially/#comments Wed, 28 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/29/all-my-gsoc-contributions-released-officially Yesterday, along with its feature-rich 2.2 release, SilverStripe officially released the 3 modules I worked on during GSOC. This is an even greater achievement for me than successfully completing the GSOC project itself. It's a great pleasure to see people around the globe using my code to get things done.

The 3 modules will make your life easy when reusing content from Flickr, YouTube and Technorati on your site, so you can come up with wonderful mashups on the SilverStripe platform. If you are a photographer, media creator or blogger, try using these modules with SilverStripe to create a truly customized site for yourself (you'll be amazed at how easy it is).

I'd love to hear your feedback and suggestions.

      Download SilverStripe 2.2 : http://www.silverstripe.com/downloads/

      Download the mashups modules from here : http://www.silverstripe.com/modules/

It's your chance to fly high with Google http://laktek.com/2007/11/29/young-brothers-its-your-chance-to-get-high-with-google http://laktek.com/2007/11/29/young-brothers-its-your-chance-to-get-high-with-google/#comments Wed, 28 Nov 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/11/29/young-brothers-its-your-chance-to-get-high-with-google When I was in college, I envied Google Summer of Code for being open only to university students. Maybe it even indirectly influenced me to get into university ;). Anyway, I ended my envy later by participating in GSOC. If you are a college/high school student under 18 and have the same envy I had, then here is news for you! Google has announced "The Google Highly Open Participation Contest".

This can be called the pre-university version of Google Summer of Code, and the objectives are similar - to get youngsters involved in contributing to open source projects. You will work on tasks from the 10 listed open-source projects, and you have to complete them by 4th February 2008. So I guess the timing is perfect, as you will be free from your school work during the holiday season. Attractive prizes are also on offer for successful participants: you will receive a certificate and t-shirt if you complete a task, and you will be paid $100 for every 3 tasks you complete. 10 grand prize winners will be selected and will get a paid trip to the Googleplex (wow!)

If you are considering participating, I would like to recommend the SilverStripe CMS project. The SilverStripe codebase is purely XHTML, CSS, JavaScript and PHP5 based, so you won't need deep knowledge to start hacking. SilverStripe is also a cool project with a lot of potential, and you will be delighted to be part of such an elite community ;). From my Summer of Code experience I can tell you that SilverStripe has a very friendly community, so you will never run out of support. Have a look at SilverStripe's task list - http://code.google.com/p/google-highly-open-participation-silverstripe/issues/list. You will notice there is a variety of tasks, such as testing, designing and documenting, where you don't even need programming skills.

Anyway, if you are considering becoming a GHOPer under SilverStripe, I'm more than willing to help you in whatever way I can. Get in touch with me.

Goodbye 2007 http://laktek.com/2007/12/31/goodbye-2007 http://laktek.com/2007/12/31/goodbye-2007/#comments Sun, 30 Dec 2007 16:00:00 GMT Lakshan Perera http://laktek.com/2007/12/31/goodbye-2007 So today we see the end of another wonderful year of our lives. It's hard to believe how fast time travels; it feels like only yesterday that I saw the dawn of 2007. This constantly reminds us how short our lives are and how limited the time we have to do something worthwhile.

Personally, I feel satisfied with how things went for me in 2007. Participating in Google Summer of Code was the major highlight of the year. Meanwhile, I was able to do well in my career at Vesess and in my degree. Still, I see there is room for improvement in terms of effort and skill, so my focus for next year will be to work harder and become ruthless.

Talking about the internet industry, we can call 2007 the year of Social Networks. The rise of services like Facebook and Twitter showed that the human factor is more important than the technology. People seem to have become more comfortable with these services, and social networking capability has become an essential element of every successful internet application.

Predicting 2008, I feel it will be the year mobile computing becomes mainstream. Along with this, web-based applications will take the lead over desktop applications. Together, these two breakthroughs will empower people's business and social lives. We will see more sophisticated mobile devices inspired by Apple's iPhone (I'm awaiting phones with Google's Android platform), and we may finally see WiMAX (802.16) supported devices in the market. This will create a huge opportunity for web-based applications in the coming year. Easy access from any location, improved user experiences and more social networking capabilities will be the things to consider in building tomorrow's web applications.

What's your next move after A/L ? http://laktek.com/2008/01/03/whats-your-next-move-after-al http://laktek.com/2008/01/03/whats-your-next-move-after-al/#comments Wed, 02 Jan 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/01/03/whats-your-next-move-after-al The results of the G.C.E. Advanced Level (A/L) examination for 2007 were released yesterday, and as usual we got to see a lot of drama and mixed emotions over the results. I don't know whether there is any exam in the world more competitive than the A/Ls in Sri Lanka. This is because the A/L is the pathway into the state university system (which provides free higher education). This opportunity is limited: out of the 250,000 students who sit for A/Ls, only about 20,000 get the chance to enter a state university. So for many students (or at least their parents), what matters is not passing the exam but passing it with flying colours to get selected to a state university (unfortunately there have been cases where even 3 A passes weren't adequate for university entrance). For students from families with lower incomes, this is the only hope of changing their living conditions. In contrast, for middle-class parents, their child being selected to a state university is more a symbol of social esteem.

However, it's very common that students are left with no idea of what to do after their A/Ls, or that the path they take (or are forced to take) leads them nowhere. As I see it, there are four common paths a student may take after their A/Ls, and I would like to share my thoughts on the consequences of each, based on what I've experienced and seen. However, I'm just an undergrad and these thoughts are only from my point of view; please take them with a grain of salt. If you have a different viewpoint, please do share it in the comments.

      Repeating the Exam

This may look like the obvious step for students who failed the exam, but in reality this is the option taken by most students, including ones who passed. I know of students who start preparing for their 2nd attempt even before the results are released. I wonder about the reason behind this; maybe a lack of self-confidence? As I see it, it's important to pass the A/Ls, since it's the stepping stone for any career or higher education opportunity. So if you've failed the exam, it's better to go for a 2nd attempt and try to pass. What about the students who passed the exam yet are thinking of a second attempt? There is a perception among some students (or their parents) that the sole intention of A/Ls should be getting selected to a state university to follow Medicine, Engineering, Management or Law. As I see it, this is the myth that makes A/Ls so competitive, a rat race. There are many better opportunities even within the state universities themselves (such as IT, Microbiology, Textile Design and Industrial Management) than those four disciplines. Actually, in today's scenario I don't believe one career is better than another in terms of opportunities or perks; it's the person in the career who makes the difference. So my advice: if you have gained a good z-score, look at the other degree courses available in state universities, and if you feel interested and have passion in that area, apply for it. Think wisely about what you want to be and about your skills before wasting another precious year of your life repeating the exam.

      Enroll for a degree in a State University

As I mentioned earlier, do a reality check on your interests and passions before selecting a degree course to follow. Remember, this decision will affect your whole career ahead. I have seen people become frustrated with their careers after a short period and just do the job for the sake of doing it (which is true even for some doctors and engineers). I guess this is also one of the reasons for the low productivity in the country.

My advice: don't select a course just because it has a higher cut-off z-score. Imagine you have a great passion for Architecture but your z-score makes you eligible for Engineering; which would you select? When you have the potential to be the next Geoffrey Bawa, why would you settle for being an average civil engineer?

So don't let the z-score override your dreams ;)

Following a Professional Course or Enrolling for a Private Degree

The points I mentioned above also apply here. Try to find unique areas of study that match your capabilities. So don't just settle for doing CIMA because your next-door neighbor does it.

Apart from that, another thing you should consider is the quality of the private institute you are going to enroll in: which professional bodies are they affiliated with; if they are offering a degree, is it an internal or external degree; what sort of facilities are available (libraries, labs, lecture rooms); and what is the demand for the field in the job market?

      Finding a Job

Finally, the next popular option among many students is to find a job after A/Ls. Actually, what you should be targeting at this point in life is a career rather than a job. Don't settle for jobs where you don't see clear career growth. Most people blindly fall into the trap of a high starting salary, without looking at the opportunities available for growing their career. Look carefully at whether the job will lock you into doing the same task for your entire life. At the very least, the job you take should encourage you to follow a professional course by providing a flexible working schedule.

It was all due to a fat finger ! http://laktek.com/2008/01/16/it-was-all-due-to-a-fat-finger http://laktek.com/2008/01/16/it-was-all-due-to-a-fat-finger/#comments Tue, 15 Jan 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/01/16/it-was-all-due-to-a-fat-finger After that bizarre story of Microsoft's support team calling a client back after 10 years, a similar thing happened to me yesterday, courtesy of DreamHost (my hosting provider). They sent me a payment overdue notice dated February 1st, 2009. Initially, I wondered whether these tech companies are doing time travelling. However, it was later revealed that, as in the case of M$, DreamHost too had made a typo while entering the figures....

Today I got an apology from the DreamHost billing team, and it almost made my day :P

      Hi Lakshan!

      Ack. Through a COMPLETE bumbling on our part, we've accidentally attempted to charge you for the ENTIRE year of 2008 (and probably 2009!) ALREADY (it was all due to a fat finger)!

      We're really really realllly embarassed about this, but you have nothing to worry about. Please ignore any confusing billing messages you may have received recently; we've already removed all those bum future charges on your account and fixed everything up.

      Thank you very very much for your patience with this.. we PROMISE this won't happen again. There's no need to reply to this message unless of course you have any other questions at all!

      Sincerely, The Foolish DreamHost Billing Team!

      Maybe adding humor is the best way to cover your mistakes...

      RMS in Sri Lanka http://laktek.com/2008/01/19/rms-in-sri-lanka http://laktek.com/2008/01/19/rms-in-sri-lanka/#comments Fri, 18 Jan 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/01/19/rms-in-sri-lanka with RMS

The week that just ended was a great one for the Sri Lankan FOSS community, as the father of the Free Software Movement, Richard M. Stallman (RMS), paid a visit to the country. Yesterday, I got the opportunity not only to listen to a live speech by the legend, but also to grab a picture with him. It was at the main public event held at SLIIT, Malabe, which was a full house!

RMS delivered a humorous and really enlightening talk, which made everyone take a step back and understand how tied up they are with non-free software. Another important issue he brought up is the use of non-free (proprietary) software in schools and universities, which ties users into this evil software for their entire lives. I think this issue should be taken seriously by developing countries like ours, where we dream of having a self-reliant and stable economic environment without being captured by multi-national firms. In the next decade this issue will become a bigger concern with the growth of the IT market. It's important for the country to produce IT professionals who know the concepts solidly, without being dependent on particular software to achieve that. The use of free software could provide an ideal foundation for this.

In his speech RMS also mentioned the easiest way for anybody to contribute to and advocate free software: always calling the system GNU/Linux (not just Linux). If you are a keen follower of the FOSS world you will know this is one of the longest-standing holy wars in the community, but to me GNU/Linux seems like the term we should use, because GNU/Linux refers to the great philosophy behind the whole movement, not just the software.

It's a pleasure to see such great people here in Sri Lanka, and kudos to ICTA for their efforts in this endeavor.

Hail St. IGNUcius (alias RMS)!

Still Alive. http://laktek.com/2008/06/18/still-live http://laktek.com/2008/06/18/still-live/#comments Tue, 17 Jun 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/06/18/still-live I know most of you may have already assumed this blog is dead and removed it from your RSS readers too. However, it's not. I always intended to keep on blogging, but my lack of consistency and pure laziness didn't really help the cause.

During the past six months of silence, I actually tried to realign my thoughts and my life. To be frank, I was not happy with where my life was heading, and with the things happening around me the future seemed filled with a lot of uncertainty. When I was 17, I had a dream of where I wanted to be, and today I have almost achieved what I dreamt of. The only thing is, now that I taste the reality of those dreams, it doesn't feel as comfortable and cosy as I wished as a teen. Today I don't see where I will be in 5 years (actually, I don't want to see). I cannot predict how my environment will change in those 5 years; I will have to adapt to whatever the circumstances are. That's why I feel more comfortable living for the day and not having concrete goals: I can take opportunities as they come and live without worrying about what I've achieved. At the end of the day, if I can bring some happiness to the people around me, I feel satisfied. For the past few months I have tried to live with this approach, by spending more time with the people who are close to me and also engaging in some new activities. It actually gave me a much better feeling than just being stuck with geekery.

If you're still reading this and wondering whether I've lost all my appetite for hacking... not quite. During this period I also worked with my colleagues at Vesess on developing a simple billing app called CurdBee. After a lot of effort, we were able to make it available to the public yesterday. It's free, so if you'd like to try it, register for an account. It's our first mass-market web app, and it gives me a bit of self-satisfaction to have been part of it.

Passion http://laktek.com/2008/07/14/passion http://laktek.com/2008/07/14/passion/#comments Sun, 13 Jul 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/07/14/passion Most of us work hard in our lives, but how many of us achieve the goals we strive for? Complaints such as "I never missed a class and never missed homework, but I couldn't get through the exam" or "Though I work 12 hours a day, I'm not getting a promotion in my job" are very common in our society. Is it just bad luck, or is there a better reason for this? As I see it, it's mainly because we don't have a true passion for what we do.

Success doesn't have a strict correlation with how much effort you put in or how much time you spend on it. There may be subjects you don't understand at all, yet you can parrot-read and score good grades in exams. There may be jobs you hate, but which give you an increment if you don't take a single day of leave all year. But can we define these as success? At the end of the day, if you feel unsatisfied with what you do and you don't see it adding any value to your life, it may not be your real passion.

People who cram always live in uncertainty: "What if the structure of the paper changes this year?" "What if the Java language loses its popularity?" If you have a real passion, you won't feel insecure about your future. Passionate people can easily adapt to changes; they can even foresee them. How? Because they are ready for the challenge. They have gone to the depths of the subject, and they are confident in themselves. Also, as I've experienced, if you have passion there's no shortage of opportunities.

Can we create passion? Passion for something involves your emotions; it's something you love to think about, love to talk about, and it will never make you feel bored. Most of the time passion matches your core skills. Loving something because others succeed in it or earn well from it is not true passion. So it's better to think about yourself and try to understand what your true passion is.

      Photo Credit : Cram time (winter) by Pragmagraphr - http://www.flickr.com/photos/sveinhal/2075747765

Passenger - Holy Grail for Ruby Deployment http://laktek.com/2008/07/17/passenger-holy-grail-for-ruby-deployment http://laktek.com/2008/07/17/passenger-holy-grail-for-ruby-deployment/#comments Wed, 16 Jul 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/07/17/passenger-holy-grail-for-ruby-deployment The popular notion about developing web apps with Ruby on Rails (or other Ruby frameworks) was "You can write a web app in 15 minutes, but it will take 15 days to deploy it correctly". Especially if you come from the world of PHP, where you just write the app and upload it, this might have been utterly confusing. Having to juggle Apache/Nginx, Mongrel clusters, or FastCGI just to deploy a simple web app would surely have sounded like a nightmare. You must have wondered countless times: why doesn't it just work as it does in PHP??

Well, finally your prayers have been answered! Enter Phusion Passenger (a.k.a. mod_rails or mod_rack), a module for the Apache web server which makes deploying a Ruby app a breeze. Yeah, like in good old PHP, you can now just upload your app and you're live! Voila!

Thanks to Passenger, you can now use the same Apache web server where you host your legacy PHP apps to deploy your Ruby apps. Also, if you cannot afford a VPS, you can even use a cheap shared host such as DreamHost (which already supports Passenger). If it was the hassle of hosting that prevented you from developing Ruby apps, it should not be a problem anymore.

So how do I get up and running?

Installing Passenger is simple, and the process is unbelievably user-friendly. All you have to run is two commands in your terminal. (I assume you have Apache 2, Ruby and RubyGems installed.)

      sudo gem install passenger

      and then run,

      passenger-install-apache2-module

      Passenger installation screen

Then you will be presented with a guided installation process. Just do as it says and you're done!

If you are using DreamHost, Passenger comes built in with the hosting package. You just have to enable it for the domain where you wish to host your app, through the DreamHost control panel.

Once you've installed (or enabled) Passenger, you can just upload the files using your favorite FTP program (or SFTP/SSH).

When you make updates, you will need to restart the application for the changes to take effect. That's not difficult either: all you have to do is create a blank file called 'restart.txt' in your application's tmp/ directory. You can do this by running the following command in the terminal.

      touch /webapps/myapp/tmp/restart.txt

If you are used to automated deployment using Capistrano, deploying apps can be even simpler. Here is a sample recipe showing how to do that.
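For illustration only, here is a minimal sketch of such a recipe. It assumes Capistrano 2 conventions, and the application name, repository URL and deploy path are placeholders you would replace with your own:

# config/deploy.rb - a minimal Passenger-friendly recipe (Capistrano 2, illustrative sketch)
set :application, "myapp"
set :repository,  "git@example.com:myapp.git"   # placeholder repository
set :deploy_to,   "/webapps/myapp"              # placeholder deploy path

namespace :deploy do
  # Passenger has no separate app server process to start or stop
  task :start, :roles => :app do ; end
  task :stop,  :roles => :app do ; end

  # restarting is just touching tmp/restart.txt, as described above
  task :restart, :roles => :app do
    run "touch #{current_path}/tmp/restart.txt"
  end
end

With something like this in place, running cap deploy updates the code and Passenger picks up the changes on the next request.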

If you need further information about Passenger, the official user guide and the following collection of resources can be helpful.

      Kudos

A few months ago there was a lot of dispute regarding the complexity of Ruby deployment, and DHH (the creator of Rails) openly invited someone to tackle the challenge rather than just complain. In this context Phusion, a small Dutch company, came up with Passenger. It's truly a great effort which has made our lives easy. Kudos to the fine folks over at Phusion, you guys really rock!

Contact Form using Merb & DataMapper http://laktek.com/2008/07/28/creating-a-contact-form-using-merb-datamapper http://laktek.com/2008/07/28/creating-a-contact-form-using-merb-datamapper/#comments Sun, 27 Jul 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/07/28/creating-a-contact-form-using-merb-datamapper A major benefit of using Rails to develop web applications is its smart conventions, which encourage developers to adopt common patterns and avoid wasting time reinventing the wheel. Rails is strictly bound to a database and has a pre-defined directory structure. Though these conventions are helpful in common cases, they reduce flexibility in custom contexts.

If you are looking for a configurable yet smart framework, you should try out Merb. Approaching version 1.0, Merb is quite stable. I found it quite productive in a lot of cases where Rails would've been too bulky. Using the flexible ORM layer DataMapper with Merb makes things even simpler.

Here is a simple example of using Merb and DataMapper to implement a contact form. Contact forms are used to mail user feedback to the site administrator. Generally, there's no need to store the information in a database; however, it may be useful to do validations and spam detection before sending it. This can be implemented with Merb by creating a flat application consisting of only a single file and a configuration directory. Furthermore, DataMapper can be used to perform validations without creating a database.

      Getting Started

First of all, you need to install the Merb and DataMapper gems. I recommend installing from the latest edge releases. This guide will show you how to do that.

Now let's create a new Merb app.

      merb-gen app myapp --flat

By passing the "--flat" option, the generator will create a flat file structure for the app. Basically, there will be two directories, for configuration (config) and views, along with application.rb, which will contain the code.

      Configurations

Add the following to 'config/init.rb' to prepare the application environment.

First we'll define the routes. Similar to Rails, Merb also supports RESTful routes. We'll create a singular resource named 'contacts' to map to the Contacts controller.

      Merb::Router.prepare do |r|
        r.resource :contacts
      
        r.default_routes
      end

We need the Merb Mailer, Merb Helpers (form controls, etc.), DataMapper core and DataMapper validations gems for this app, so let's define those dependencies.

      require "merb-mailer"
      dependency 'dm-core'
      dependency 'dm-validations'
      dependency "merb_helpers"

Next, specify the SMTP settings of your mail server, which will be used to send the mails.

      Merb::Mailer.config = {
          :host   => 'mail.example.com',
          :port   => '25',
          :user   => 'test@example.com',
          :pass   => '',
          :auth   => :plain,
          :domain => "example.com" # the HELO domain provided by the client to the server
        }

      Implementation

      Now let's move into 'application.rb' where we do the essential stuff.

Let's define a Contact class, which will be the DataMapper model. In the configuration we didn't set up a database connection for DataMapper, since we are not going to store the information, so DataMapper will use the default abstract adapter. This is good enough to provide ORM behavior without persistence. In the model we define the fields and the required validations. DataMapper offers smart validations such as "validates_format :email, :as => :email_address", which saves you from juggling with regular expressions.

      class Contact
        include DataMapper::Resource
      
        property :id, Integer, :serial => true
        property :name, String
        property :email, String
        property :message, String
      
        validates_present :name, :email
        validates_format :email, :as => :email_address
      end

Next we have to set up the controller. Our controller will have two actions - show and create. The RESTful routes will map GET requests to the path '/contacts' to the show action. Similarly, POST requests to the same path will be mapped to the create action.

      class Contacts < Merb::Controller
        def show
          @contact = Contact.new
          render
        end
      
        def create
          @contact = Contact.new(params[:contact])
          if @contact.valid?
            send_info
            render "Thank you."
          else
            render :show
          end
        end
      
      private
        def send_info
          m = Merb::Mailer.new :to => 'lakshan@web2media.net',
                               :from => @contact.email,
                               :subject => 'Contact Form Results',
                               :text => "#{@contact.name} (#{@contact.email}) wrote : \n #{@contact.message}"
          m.deliver!
        end
      end

We defined a private method 'send_info' to handle the mailer functionality. Finally, here is the view template that renders the form:

       <%= error_messages_for :contact %>
       <% form_for :contact, :action => url(:contacts) do %>
            <div><%= text_control :name, :label => 'Name' %></div>
            <div><%= text_control :email,  :label => 'Email' %></div>
            <div><%= text_area_control :message, :label => 'Message' %></div>
            <%= submit_button 'Send' %>
       <% end %>

These form helper tags come with the merb_helpers gem.

      Done !

We're done with the coding; now let's see how it works.

To start your app, go to your app directory in the terminal and run the command merb. Then point your browser to 'http://localhost:4000/contacts' (normally Merb runs on port 4000) and check whether it works correctly.

That's all, folks! Doesn't Merb sound great?

      I have uploaded the complete source of the sample to github - http://github.com/laktek/contact-form.

Simple command line todo list http://laktek.com/2008/08/28/simple-command-line-todo-list-manager http://laktek.com/2008/08/28/simple-command-line-todo-list-manager/#comments Wed, 27 Aug 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/08/28/simple-command-line-todo-list-manager They say pen and paper is the best way to manage a todo list. Following the popular norm, I also started tracking my todos with pen and paper. But after several unsuccessful attempts at finding a pen, or deciphering tasks from torn or soaked paper, I felt the keyboard and the pixel screen would be more accessible than the traditional method. Without relying on any sophisticated applications, I went with the simplest form - plain text. It was better, but after being inspired by efforts like todo.sh and todo.pl, I thought of coming up with a small command line utility of my own.

Enter the todo gem!

This resulted in my first Ruby gem - todo. It is just a simple command line utility for managing todos. I didn't want to lose the flexibility offered by plain-text lists, so it uses a human-readable YAML format to store the lists. This lets you use your favorite text editor to edit the todo lists. It also supports tags: you can list tasks by tag, which makes things smart and easy. A rough sketch of what working with such a YAML file looks like is shown below.
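Purely as an illustration (the exact file name and field layout the gem uses may differ), reading and writing a plain YAML task list in Ruby could look like this:

require 'yaml'

# hypothetical file name used for this sketch; the gem's actual name may differ
TODO_FILE = '.todo.yaml'

# load the existing list, or start with an empty one
tasks = File.exist?(TODO_FILE) ? YAML.load_file(TODO_FILE) : []

# add a task with a couple of tags
tasks << { 'name' => 'write the specs', 'tags' => ['important', 'due:24/08/2008'] }

# list the tasks carrying a given tag
tasks.select { |t| t['tags'].include?('important') }.each { |t| puts t['name'] }

# write the list back as human-readable YAML, still editable by hand
File.open(TODO_FILE, 'w') { |f| f.write(tasks.to_yaml) }

Because the storage stays plain YAML, you can always fall back to your text editor when the command line gets in the way.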

The todo gem works per directory, which lets you keep a different task list in each directory. For example, I keep a todo list in my 'home' directory which holds all my housekeeping stuff, and I have separate lists for each project I work on in their respective directories. This separation allows better organization of the things to be done.

      Example

      Here is a basic example of how todo gem works:

        #visit your project folder
        cd projects/newapp
      
        #create a new todo list for the project
        todo create
      
        #add a new task
        todo add "write the specs"
        - add tags : important, due:24/08/2008
      
        #listing all tasks
        todo list --all
      
        #listing tasks tagged 'important'
        todo list --tag important
      
        #removing a task by name
        todo remove "write the specs"
      
        #removing a task by index
        todo remove -i 1

      Get it!

To install the todo gem on your machine, simply run:

      sudo gem install todo

The code of the gem is also hosted on GitHub, so you can fork it and flip it any way you want (and don't forget to send me a pull request).

Startups and Real Life http://laktek.com/2008/09/28/startups-and-real-life http://laktek.com/2008/09/28/startups-and-real-life/#comments Sat, 27 Sep 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/09/28/startups-and-real-life Yesterday I participated in the first Academic Symposium organized by our faculty. The topic for the day was Graduation, Entrepreneurship and Success. The panel, which was moderated by Peter D'Almeida (he is also an old Benedictine), included Dr. Sanjeeva Weerawarna, Wegapitiya (Laughs), Harsha Purasinghe (Microimage), Ramesh Shamuganathan (JKCS) and Mohammed Azmeez (Concept Nursery). The discussion raised some insightful and interesting thoughts on passion, ideas, startups and funding.

However, like many other startup-related discussions, it paid very little attention to the social factors involved in running a startup. In my view, that is the most challenging aspect, more so than how to build the product or how to raise funds.

They stressed the point that you should pay attention only to your passion and work continuously to achieve success; basically, it should be your top priority. What does this mean? You have to leave behind your family, spouse and friends to drive your goals. One day you may be on the Fortune 500 list, but can you be satisfied with your life after neglecting your near and dear ones? Is this way of life happy and responsible?

From my little experience, what I see is that if you are to solve real-world problems you have to live in the real world. This is where most tech companies have gone wrong. For them the real world sucks (as in the Jerry Seinfeld and Bill Gates commercial), but the truth is you cannot understand it through logic, theories or research alone. If you are to understand the real world, you have to live in it.

If you become detached from the real world and isolate yourself in work, you are producing imaginary products. You are led to believe the world will embrace any stupid idea you throw at it. Tech blogs may call your idea a Paradigm Shift and VCs may be ready to invest lucrative amounts of money, but unintentionally you're taking the world away from reality. You should not be surprised if someone thinks he is safe from peak oil and the food crisis because he has enough oil wells and fields in Second Life. However, the reality is that there are more basic problems in this world which have never even caught the attention of this so-called Web 2.0 space. To find these problems, you need not go to the other corner of the world; these needs are already within our everyday lives.

So don't let startup fantasies ruin your real life.

Love what you do http://laktek.com/2008/10/15/love-what-you-do http://laktek.com/2008/10/15/love-what-you-do/#comments Tue, 14 Oct 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/10/15/love-what-you-do In "Maybe you can't make money doing what you love", Seth Godin challenges the conventional wisdom about careers. He argues that converting your passion into a profession sometimes may not work. Though most of us may not be willing to accept it, this is the harsh reality of life. Rather than doing what you love, you have to begin loving what you do.

Whether you are employed in a large organization or running your own startup, you cannot expect to work only on what you love. If you are in a large organization, you may have to bow to your boss and do whatever he orders if you want to secure your paycheck. Though you have creative and innovative ideas, getting past organizational restrictions and policies may not be easy. You may think that if you were running a startup you would have the freedom to follow your heart's desires. In reality, it's not as sweet as it sounds. If you want your startup to make profits and have a steady cash flow, you always have to operate with scarce resources, so you will need to go beyond what you love to do.

Imagine a bunch of kickass coders working towards creating the next killer web app. If the app's interfaces are confusing, the servers are clunky, support is poor and the business model is vague, it will be in the deadpool in 2 weeks' time. In reality, the code is only 10% of the whole mission. In a startup you need to handle a highly varied set of tasks yourself; you cannot avoid anything by saying that's not my cup of tea.

You cannot expect the world to spin the way you want. You have to embrace whatever comes to you and turn it to your own good. If you wait until the perfect time comes, it will be too late. The secret mantra of many successful people is their multifaceted character. Were they born gifted with these skills? I believe they developed them by loving what they had to do.

No matter how much passion you have, if you don't know how to market, how to face challenges and how to create opportunities, there's very little chance you will really benefit from it.

Really Simple Color Picker in jQuery http://laktek.com/2008/10/27/really-simple-color-picker-in-jquery http://laktek.com/2008/10/27/really-simple-color-picker-in-jquery/#comments Sun, 26 Oct 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/10/27/really-simple-color-picker-in-jquery Recently, I needed a color picker with a predefined color palette for my work. Thanks to many enthusiastic developers, several popular, sophisticated color pickers already exist for jQuery. However, most of these plugins look complex, as if they were made to be used in an online image editor. They are overwhelming for simple usage and less flexible for customization. So I wrote my own simple color picker from scratch.

Usage of the color picker is very straightforward. Users can either pick a color from the predefined palette or enter a hexadecimal value for a custom color. Compared to other plugins, it is very lightweight (only 5KB uncompressed) and unobtrusive. It doesn't require any dependencies apart from jQuery core, and it uses simple HTML/CSS for presentation. You can easily customize the default color palette by adding more colors or replacing it with a completely different set of colors.

      Want to try?

If you want to see it in action before trying it out yourself, here is a simple demo of the plugin.

      Download Color Picker via GitHub

      Usage

Color Picker requires jQuery 1.2.6 or higher, so make sure to load it before Color Picker (there are no other dependencies!). For the default styles of the color picker, load the CSS file that comes with the plugin.

      <script src="jquery.min.js" type="text/javascript"></script>
      <script src="jquery.colorPicker.js" type="text/javascript"></script>

      Add a text field to take the color input.

      <div><label for="color1">Color 1</label>
      <input id="color1" name="color1" type="text" value="#333399" /></div>

Then call the 'colorPicker' method on the text field when the document loads.

       jQuery(document).ready(function($) {
          $('#color1').colorPicker();
        });

      Your favorite colors are missing? Just add them to the palette

        //use this method to add new colors to palette
        $.fn.colorPicker.addColors(['000', '000', 'fff', 'fff']);

      Or completely change the color palette as you need...

        $.fn.colorPicker.defaults.colors = ['000', '000', 'fff', 'fff'];

      That's all you have to do!

      Future Improvements

      This is only the initial release of Color Picker. There may be wild browser bugs or you may find smarter ways to improve the functionality of the plugin. I'm open to all your suggestions and complaints. Leave a comment here or contact me directly.

      Further, the code of the plugin is available via GitHub, so if you feel like forking it and playing with it please do!

      Update

      Plugin now supports all major browsers including IE6! (Thanks muser for the patch)

      Update #2 (October 14, 2009)

The color picker will now automatically update its color when the value of the input field is changed externally. (Thanks John for initially identifying the issue and Sam Bessey for pushing me on to this change :) )

      Update #3 (February 17, 2012)

Made a significant change to support transparency and other additional options. Special thanks for the contributions from Daniel Lacy. Please refer to the README for more details.

Extended-Bort: My base Rails app http://laktek.com/2008/10/31/extended-bort-my-base-rails-app http://laktek.com/2008/10/31/extended-bort-my-base-rails-app/#comments Thu, 30 Oct 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/10/31/extended-bort-my-base-rails-app Bort is an awesome base Rails app which lets you get into real action without wasting your time setting up the most common and boring stuff. It comes with RESTful Authentication, OpenID support, Capistrano multi-stage deployments and many other essential plugins, thus lifting a good chunk of the workload. I first got to use Bort when developing MyConf for Rails Rumble, where agility mattered most. Since then, I've felt it would be ideal to use Bort as the cookie cutter for my future Rails apps as well. However, I felt several changes were needed to make it better suit my workflow. Hence, I forked Bort and came up with Extended-Bort!

      What are the changes?

      • Git Submodules are used to keep Rails and other plugins updated.
• Included Rails 2.2.0 with the app
      • Added annotate-models and make_resourceful plugins
      • Added Action Mailer initializer and SMTP settings for production mode
      • Uses admin email specified in settings.yml in exception notifier
• Replaced the rSpec story runner with the new Cucumber scenario framework (webrat and cucumber plugins are included)
      • Replaced Prototype js with jQuery
      • Replaced asset_packager with bundle_fu for bundling assets
      • Changed Stylesheets by adding an initial stylesheet, application stylesheet and Hartija CSS print stylesheet

      Want to Use?

      If you feel like using Extended-Bort, follow these steps:

git clone git://github.com/laktek/extended-bort.git
git submodule init
git submodule update

      Edit the database.yml and the settings.yml files

rake db:migrate

Change the session key in config/environment.rb and REST_AUTH_SITE_KEY in the environments config (you can generate keys using rake secret)

      Have a brew and celebrate (from original Bort guys, but you can still do it ;) )

      If Rails is a Ghetto, Merb is a Whorehouse http://laktek.com/2008/11/20/if-rails-is-a-ghetto-merb-is-a-whorehouse http://laktek.com/2008/11/20/if-rails-is-a-ghetto-merb-is-a-whorehouse/#comments Wed, 19 Nov 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/11/20/if-rails-is-a-ghetto-merb-is-a-whorehouse Don't get confused over the title, I'm not trying to punch the Merb community as Zed did to Rails. A grasshopper like me don't even qualify to do such a rant. I'm actually trying to pimp Merb!

      Merb is the newest addition to Ruby town. First, it was started just to satisfy the unfulfilled desires of some homies who lived in the Rails ghetto. Soon they found Merb to be a sweet spot where they could refresh and relieve the pressures they had in Rails. Hush-hush about Merb was spreading fast, and it had its doors open for everyone from day one. However, apart from the hustlers, many others backed off, mainly due to the spread of FUD. In the midst of all the hate-games Merb has turned 1.0, and even Matz, the Godfather of Ruby Town, has given his thumbs up for Merb. It is no longer a dark alley and it's here to stay.

      It's your Call

      Not all want bang with big Racks, your taste maybe for micros. You may want to roll with a mature like ActiveRecord or maybe you love to do a tenderly Sequel. How about relaxing in Couch? Nothing to be embarrassed, Merb knows how to satisfy you all alike. Just do it in your style, in Merb you could even cum_later run_later. Did I tell you, that you could bring your own toys (meh, slices) to Merb?

      Could I be exposed to STDs?

      Many fear that Merb will bring STDs (Stupid Terrible Dependencies) to their apps and systems. In fact this was a PITA in the early days of Merb, but the power of Thor and the new bundling strategy will help you take better care of yourself.

      If you yearn for some real fun and action, now it's time to head over to Merbhouse!

      ]]>
      I want to be a Ruby Hacker... http://laktek.com/2008/11/21/i-want-to-be-a-ruby-hacker http://laktek.com/2008/11/21/i-want-to-be-a-ruby-hacker/#comments Thu, 20 Nov 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/11/21/i-want-to-be-a-ruby-hacker In the past couple of months, I heard this from a number of my friends. I hope there will be more joining the club in the coming days. For the benefit of the freshers, I thought of sharing some tips I learned about Ruby hacking (though they may sound obvious to many).

      1. Write something that scratches your itch - Rather than blindly following some tutorial someone has written, try to solve one of the problems you have (such as simple todo list) and try to grasp the concepts during the process.

      2. Learn to read source code - Reading code is one of the best ways to learn how to code from great programmers. Especially in Ruby, the syntax is very easy to comprehend, so you can read code as if you were reading a fairytale. If you are looking for a good book to begin with Ruby idioms, I recommend Why's (Poignant) Guide to Ruby, which inspired me to learn Ruby.

      3. Pragmatic Programming approach and Agile development is the way to go - If you are still accustomed to writing an SRS and drawing UML diagrams before you begin to code, then you will not feel comfortable with the concepts of Ruby. Find and read The Pragmatic Programmer by Andy Hunt and Dave Thomas; it will help you refresh for a good start.

      4. Use a text editor - If you are coming from .Net or Java environments, you may be obsessed with using an IDE. But adopting a lightweight text editor such as Emacs, Vim, Textmate (Mac only) or Gedit is bliss. Because Ruby is all about crafting code, not dumping auto-generated piles of shit.

      5. Learn Git - The Ruby community loves Git for source code management. Many Ruby frameworks, gems and plugins use Git as their SCM. Besides, Git can make your development workflow more flexible, productive and reliable. You may like to bookmark GitHub. Don't complain to me if, in 6 months, you're spending more of your day browsing it than Facebook.

      6. Use a *nix Operating System - Ruby and related tools play really nicely on Linux, Mac and other *nix based Operating Systems. That doesn't mean it's not supported on Windows, indeed it is. But if you are looking forward to a deep dive into Ruby, you will feel more comfortable with a *nix based environment.

      7. Don't live under a rock - The Ruby community is very fast paced. There are a lot of new plugins, gems and frameworks coming up every week. Also, a lot of best practices and useful tutorials are being blogged. So it is better to follow some blogs and podcasts to keep yourself updated. Personally, I recommend RubyInside, RubyFlow and the RailsEnvy podcast to catch the best.

      8. Bonus Tip: Learn to be "Passive Aggressive" - You will need to understand passive aggressive behavior and practice it for your defence. Even doctors say it's good for your health :P

      ]]>
      Ruby Advent Calendar http://laktek.com/2008/11/30/ruby-advent-calendar http://laktek.com/2008/11/30/ruby-advent-calendar/#comments Sat, 29 Nov 2008 16:00:00 GMT Lakshan Perera http://laktek.com/2008/11/30/ruby-advent-calendar Inspired by the previous incarnations of Ruby advent calendars, I thought of running one for this year. For those who are unaware, the idea is simple: serve an interesting article on Ruby each day for the first 24 days of December.

      We saw a lot of cool stuff in Ruby throughout the year, so it would be nice to reminisce on those and learn some new tricks for the year ahead.

      Like to Contribute?

      I would really love to have your contributions. You could share almost anything related to Ruby culture. Please feel free to let me know (lakshan at web2media dot net) if you would like to contribute. Let's make this fun and knowledgeable.

      Bookmark and Follow

      First post will go up tomorrow (Monday) morning. Be sure to bookmark! - http://advent2008.hackruby.com

      You could follow the daily updates via Twitter - rubyadvent

      Articles

      ]]>
      Belated Resolutions http://laktek.com/2009/01/11/belated-resolutions http://laktek.com/2009/01/11/belated-resolutions/#comments Sat, 10 Jan 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/01/11/belated-resolutions It is almost mid January and this is my first post for the new year. I'm looking at 2009 as a switching year in my life. The days of adolescence have almost come to an end and real life responsibilities seem to be creeping up. The days for bullshitting are numbered, and "I'm still learning" and "I'm still trying" will no longer be accepted as excuses. So it's high time to get real and face the music.

      Here is what I will be looking to do this year (in no particular order)

      • Fix my sleeping pattern (and cut it short to 5 hours)
      • Have a regular workout routine.
      • Turn CurdBee into a solid web platform.
      • Keep on Blogging
      • Launch hackruby.com
      • Toastmastering for real (and be a competent communicator)
      • Horticulture
      • More Driving (only if I could fill my gas tank)
      • Avoid becoming an RSS/Twitter junkie
      • AND finish off this Degree, which sucks in big time!
      ]]>
      Professional Education is Bullshit! http://laktek.com/2009/01/16/professional-education-is-bullshit http://laktek.com/2009/01/16/professional-education-is-bullshit/#comments Thu, 15 Jan 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/01/16/professional-education-is-bullshit Most of my university colleagues have this craze of following various professional education courses and certifications. They call it CIMA, BIT, BCS, ACS, SCJP, CCNA, MSDN, PMP and the list goes on. I don't get the rationale behind this. What's the benefit of having all these qualifications? What exactly do you gain by spending such hefty amounts of money on these courses? Is it because you think you could decorate your CVs with all this bullshit? Or is it just for the sheer pleasure of seeing random Latin characters printed after your name?

      As far as I know, most of these courses focus on a single line of technology or a certain set of standards, with no guarantee of still being relevant in another two years' time. Also, compared with academic education, these courses don't offer much diversity or depth either. You could easily get through the exams by parroting the mock question bank and puking it all out at the exam. There is very little chance of anything being retained or absorbed.

      The reality is that those qualifications or the grades themselves won't make you brilliant. They will just take you far away from reality. They will give you and the world the impression that you are a qualified professional. But in reality, most of these people struggle to get things done and fail miserably at the real targets. Bill Gates, Steve Jobs, Sergey Brin and Larry Page all have one thing in common: all of them had to drop out of their academic careers to reach their destiny. That doesn't mean you need to drop out to make a difference. However, what is evident is that the stuff you do, the challenges you meet, the problems you solve and the experiences you gain during your academic career are what make you different from the rest of the stack.

      Inspire yourself to gain some real life opportunities. Organize a gig, do some research (I mean real research, not tomfoolery), start contributing to an open source project or create your own startup. Find something which matches your passions and engage with it. Don't just waste your precious time and money blindly running after professional qualifications that aren't worth a shite.

      ]]>
      Get mocked! http://laktek.com/2009/02/25/get-mocked http://laktek.com/2009/02/25/get-mocked/#comments Tue, 24 Feb 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/02/25/get-mocked My first few days doing Maths for Advanced Levels were a nightmare. I had always sucked at Maths in high school, and it was only my strong passion for IT that made me do Maths (that's the only way in Sri Lanka to gain higher education in IT). I couldn't grasp a single shit, other than recognizing some Greek characters on the board. The tutor was ruthless, and sarcastic at his best. To make matters worse, the class was full of the opposite sex, who were waiting to LOL at any insult thrown. I just sounded like a total dumbass!

      The tutor said I would not go beyond a simple pass even if I worked my ass off. However, after two years of hard work, I proved him wrong by entering the University of Moratuwa and pursuing my childhood dream of a career in IT. It was the mockery in that class that made me strong and motivated me to bring out the best in me.

      It's natural to feel humiliated and give up when you get mocked by others, but try to turn them into your own advantage. Don't try to avert them or defend them. Let them mock you! Just keep believing in yourself and stick to what you do!

      ]]>
      Ruby Best Practices http://laktek.com/2009/04/14/ruby-best-practices http://laktek.com/2009/04/14/ruby-best-practices/#comments Mon, 13 Apr 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/04/14/ruby-best-practices First of all, sorry for letting this space go on a hiatus yet again. Though I tried to make a habit of posting regularly, other priorities didn't allow me to do it as I wished. In the last few months I ran into a lot of challenges, both in real life and in hacking, which I feel would be worth sharing. I promise I will start posting them soon.

      Meanwhile, I'm glad to inform you that I will also be contributing to the Ruby Best Practices Blog, a collaborative effort organized by Gregory Brown of Prawn fame. The rest of the RBP core team includes well experienced and interesting developers such as James Britt, Kirk Haines, Robert Klemme, Jeremy McAnally, Sean O’Halpin and Magnus Holm. So if you are passionate about writing smart and robust Ruby code, you will really enjoy this blog.

      Visit Ruby Best Practices blog.

      ]]>
      [Rails Tips] Reduce Queries in ActiveRecord with :group http://laktek.com/2009/06/13/rails-tips-active-record-querying-be-smart-and-vigilent http://laktek.com/2009/06/13/rails-tips-active-record-querying-be-smart-and-vigilent/#comments Fri, 12 Jun 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/06/13/rails-tips-active-record-querying-be-smart-and-vigilent I thought of sharing some tips on Ruby on Rails development, which would come in handy especially if you are a newbie. The cornerstone of all of Rails' magic is ActiveRecord. As you know, it's an ORM, which hides all the cumbersome and mundane SQL behind a syntactic sugar coating. However, blind and lazy usage of ActiveRecord can really hurt your application's performance. I found this particular instance when revisiting the code of an app I had written in my early days with Rails. As a newbie overdosed with ActiveRecord's magic, I had written a blunt piece of code which looks horrible and also makes the app painfully slow.

      1550 items in total (1350 Available, 150 Out of Stock and 50 Discontinued)
      This was the expected summary output. On the surface, displaying such a block seems trivial, right? I had the following in the view:

      <%= @all_items_count %> items in total (<%= @available_items_count %> Available, <%= @out_of_stock_items_count %> Out of Stock and <%= @discontinued_items_count %> Discontinued)

      Then in the controller, I had explicitly assigned the ActiveRecord query results to all four variables. (The 'acts_as_state_machine' plugin provides the count_in_state method.)

      def index
         @all_items_count = Item.count :all
         @available_items_count = Item.count_in_state :available, :all
         @out_of_stock_items_count = Item.count_in_state :out_of_stock, :all
         @discontinued_items_count = Item.count_in_state :discontinued, :all
      end

      Technically this works. But can you spot the issue here? Let's get to the terminal and inspect the app's logs. When rendering this action, it sends a separate query to the database to get each of the four values. Database queries are costly and will cause a slow response.

      Better Solution?

      If we could reduce the number of database queries, this action would be much more efficient. So is there a way to do that? Remember that we can group the results of an SQL query? You can specify the :group option in ActiveRecord's query methods. I modified the previous code to pass the :group option to the query:

      def index
         @items_in_state = Item.count :all, :group => "state"
      end

      Now, instead of four queries, we are making only one database query. With the :group option passed, ActiveRecord will return the results as a Hash keyed by state, so we can grab the count of items in each state directly.
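      For the sample data above, the returned Hash would look roughly like this (the counts are the ones from the summary line, shown purely for illustration):

        @items_in_state
        # => { "available" => 1350, "out_of_stock" => 150, "discontinued" => 50 }

      Let's change our view to adapt to the changes we made in the controller.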

      <%= count_items("all") %> items in total (<%= count_items("available") %> Available, <%= count_items("out_of_stock") %> Out of Stock and <%= count_items("discontinued") %> Discontinued)

      I used a simple helper method called count_items to keep the view elegant. Here is what goes in the helper:

      def count_items(state)
        return @items_in_state.inject(0) { |count, s| count + s[1] } if state == "all"
        @items_in_state.fetch(state, 0)
      end

      To return the total count, we use the inject method, which iterates through the hash and takes the sum. Also, keeping the basics in mind, we should index the database fields that are queried regularly. In this case, it is better to add an index on the state column of the table (see the sketch below). Mistakes like this are very obvious and can easily be avoided if you do things with some sense. However, know the trade-offs and always keep an eye on what's happening backstage. Don't let the faithful genie turn into a beast.
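      As a minimal sketch (the migration and class names below are just examples, not from the original app), adding that index could look like this:

        class AddIndexToItemsState < ActiveRecord::Migration
          def self.up
            # speeds up queries that filter or group on the state column
            add_index :items, :state
          end

          def self.down
            remove_index :items, :state
          end
        end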

      ]]>
      Sticking to Basics http://laktek.com/2009/07/07/sticking-to-basics http://laktek.com/2009/07/07/sticking-to-basics/#comments Mon, 06 Jul 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/07/07/sticking-to-basics I love Test Cricket. I believe it has a very close resemblance to our real lives, and it gives so much inspiration. Sri Lanka's remarkable victory today against Pakistan, after being in the jaws of defeat, is yet another classic example of that. Pakistan were dominating the first 3 days of the game and it was almost certain they had sealed the victory by the end of yesterday. Yet, within just one unfocused, carefree session of play they let go of all the good work they had done in the past 3 days.

      On the other hand, the Sri Lankans went into the field today with a glimmer of hope of a victory. They needed to bowl out the Pakistanis, who were within a mere 97 runs of their target. Given the strong batting line-up of Pakistan, it seemed a daunting task, and it looked like only a miracle could reverse the result. But the Sri Lankans did win comfortably in the end, without any miracles or magic. Sri Lanka's match-winning bowlers - Murali, Vaas and Malinga - were not even in the playing XI. Ajantha Mendis, the only trump card for Sri Lanka, had only an ordinary game. So what changed the game? It was the hard work of 3 average bowlers - namely Herath, Thushara and Kulasekara. They bowled with discipline, and skipper Sangakkara kept his trust in them and rightly exploited the opportunities.

      Pakistan skipper Younis Khan, very correctly explained what cost them the game - "When I am under pressure, I go back to my basics. They need to go back to basics too. Break it down into small-small sessions, be it batting, be it bowling, be it fielding. It's only a six-hour day, it shouldn't be that difficult."

      Isn't this what happens in our lives too? When things are going fine for us we tend to neglect our basics. Also, when things go wrong we experiment and seek all sorts of other fixes - but forget to return to our basics. Today's game taught us the value of sticking to our basics. That's the secret mantra for success!

      ]]>
      Six virtues of being an IT undergrad... http://laktek.com/2009/07/08/six-virtues-of-it-undergrad http://laktek.com/2009/07/08/six-virtues-of-it-undergrad/#comments Tue, 07 Jul 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/07/08/six-virtues-of-it-undergrad If you were one of the early readers of my blog, you would know that I was selected to do my Bachelors in IT at the Faculty of Information Technology, University of Moratuwa. It's hard to believe 3 years have gone past in a flash, but reminiscing about what I gathered during this period feels really awesome.

      From a young age, I had a passion for playing with computers and the internet. In those days I used to day-dream of building all sorts of awesome products, and believe me, I still have some of the pen-sketches of those ideas. By the time of my Ordinary Levels I had already got the opportunity to do some work with local web development companies (especially with the awesome folks at E-Fusion, the people who did kaputa.com - which became the trend setter of Sri Lankan portals). So my initial idea was to say goodbye to formal education after O/Ls and get into full-time work in IT. I even had some friends at school who were ready to work with me on a startup. However, my mentor at that time, Mr. Niranjan Meegamana of E-Fusion, convinced me that I should continue with my secondary education and pursue a degree in IT if I really wanted to have a long journey in the industry. That motivation led me to select Sri Lanka's only national degree course for IT.

      Initially I had a few doubts about how things would turn out, mainly due to all the crappy stuff I had heard and seen about local universities. However, things turned out to be really peaceful within the IT faculty and the University of Moratuwa. There were zero interruptions to the course and third-party influences were minimal. Honestly, I believe getting into the IT faculty was one of the best things that happened in my life, and it opened up a lot of opportunities for me to reach my ultimate goal.

      I thought of sharing some of those experiences and highlights hoping it would help to inspire someone else.

      1. It's Free

      Thanks to the free education structure in Sri Lanka, I'm privileged to do my degree course for free, which would have cost more than USD 10-15K if I had done it at a university in another country or at a private institute. Coming from an average middle-class family, I'm really happy that I could continue my higher education without being a burden on my parents.

      2. Converted to FOSS

      When I stepped into the IT faculty, I was not a hardcore advocate for FOSS. But within a few weeks, due to the influence of FOSS advocates at the faculty - like Prabhath, Mifan, Anushke and Amila - I too converted into a FOSS purist. Since then I haven't looked back, and today free software culture has become an integral part of my life. The joys and benefits reaped from free software culture deserve a separate post, so I will share more on that in the future.

      3. Internship at a startup as a freshman

      As Paul Graham says, "The way to learn about startups is by watching them in action, preferably by working at one". I got this opportunity from my freshman year. Prabhath, who happened to be my mentor and role-model at the university, invited me to join them at Vesess. The inspiration and motivation I gathered just by watching how they work was immense. The experience you gain from the challenges at a startup cannot be matched by any other learning experience. Not only do you get to solve problems that matter in real life, you also get to see how people use what you build. Our product for online billing - CurdBee - has today become one of the most essential apps for freelancers.

      4. Google Summer of Code

      The Google Summer of Code program is hosted by Google to encourage university students around the globe to contribute to Open Source software. There is strong interest in this program at the University of Moratuwa (in 2008, it happened to be the top university in the world by number of accepted GSOC students). I had the opportunity to take part in GSOC in 2007, where I worked on the Silverstripe CMS Framework. I successfully completed building a Mashups module for Silverstripe and released my work for public use.

      5. Opportunities to network

      I believe the most important stuff you learn at university is learnt outside the lecture room. I had the chance to listen and talk to a lot of amazing people from different walks of life. Some of them were visiting lecturers, seniors and my own batchmates, and some were just random dudes I accidentally caught up with for a little chat at the canteen while having a tea. No matter who they are, listening and sharing thoughts with them helped me expand my perspectives and change my attitudes towards life. During our usual after-lunch banter we go through an unimaginable number of topics - music, cricket, geekery, hacking, farming, the environment, the oil crisis, politics, women, sex, religious philosophies. It's truly amazing how much knowledge and experience can be shared when such a highly diversified group of people gets together.

      6. Lurking on the internet gives you the fringe

      In IT you rarely have to parrot long formulas or boring theories. You don't need to burn yourself out doing field research, and there's no need to waste your time with boring practicals. No need to run after seniors for 'kuppis' during exam time. All you have to know is how to use Google and Wikipedia to get through the academic stuff. I have sat for exams without a single note, having only read Wikipedia. If you have a little itch to read more on a subject and keep up with the latest trends, you will have a fringe benefit over others. I don't think there could be a better academic course than IT for an internet addict like me.

      It's not the qualification you gain from an academic degree course that matters; the exposure, opportunities and experience you gain during the journey are what will shape your future.

      ]]>
      Ban Schools & Education! http://laktek.com/2009/07/28/ban-schools-education http://laktek.com/2009/07/28/ban-schools-education/#comments Mon, 27 Jul 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/07/28/ban-schools-education It's very sad and alarming to hear about the recent incidents taking place at Sri Lankan schools. The government shows how concerned it is about these issues by banning everything it believes can harm our next generation :) Mobile phones are banned at all schools, websites with explicit content are banned, and the screening of "Adults only" movies is banned. OK, now the government can say it has taken all the necessary steps to groom our next generation into well disciplined citizens, and the prosperous future of our country is guaranteed.

      However, government and its so-called advisers will never realize the root causes of all these problems. Their short sighted decisions and floating policies from the past, have aggravated these problems to this level and none of their decisions would help to change the situation in the long run.

      I believe the Sri Lankan education system is screwed up big time! Kids are thrown into a rat race from kindergarten, when they don't have even a slight clue where they are heading. Not to mention, even after going through all the steps of primary, secondary and tertiary education, more than 80% of them still don't have an idea why they ran all these years. It leaves a big question: do we have to run at all?

      Aside from the spoon-fed knowledge, selfishness, insensitivity, jealousy and a hunched back (after carrying a 4 kg school bag) are the only gains from the current education system. Why has the Sri Lankan education system failed so miserably at building citizens with the self-confidence that they are adding value to society? Why can't someone be a janitor, carpenter, factory worker, farmer, dancer, sportsman or a doctor and still feel they are all equal in society?

      This false social grading starts from the primary school admissions. Only the kids of the rich and the so-called elites get admitted to the popular schools. No matter how close you live to the school, your child will not be admitted if you cannot afford a hefty donation to the school's development fund, or if you don't have enough civil power and political influence. From year 1 these kids start associating only with a certain social layer and never understand that there are other ways of life above or below them. They measure the quality of their lives relative to these layers: basically, the layer above them is superior and powerful, and the layer below them is inferior and wretched. They will never understand that all these social layers have their own mix of good and bad.

      The next biggest mistake is the misinterpretation of aesthetics and extra-curricular activities in schools. You are not allowed to sing or dance unless you want to take part in Derana Little Star; you don't get to play cricket unless you can make the college XI. Talking about myself, I had no skill in any sport or aesthetic pursuit. Still, I went to football practices, knowing bloody well that I would never be selected for the college team. I participated in drama, dancing and singing practices for the cultural day every year, though I only got the chance to be on stage a handful of times. Later I learnt those weren't my talents. But looking back today, the experience and lessons learnt through those activities are impossible to gain by just sitting in a classroom. The neglect of extra-curricular activities in schools is also a main cause of the unfortunate incidents we hear about today. I know of some schools cancelling sports meets and cultural days to finish the syllabuses on time. Can we call such places schools?

      There is more stuff running through my mind, but I will stop this rant here. What I want to stress is that whether you are a government official, principal, teacher, parent, someone with a sibling in school or even a total outsider - please pay attention to the root causes and be aware of what's really happening at schools. I'm sure none of you would want to hear of more unfortunate incidents.

      ]]>
      Independent Thinking http://laktek.com/2009/08/08/independent-thinking http://laktek.com/2009/08/08/independent-thinking/#comments Fri, 07 Aug 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/08/08/independent-thinking Independent thinking and self-conscious decision making are what build a person and a society. Yet, they are the most discouraged, criticised and often punishable acts a person can commit. Our culture has done a nice job of misinterpreting and abusing values such as obedience, loyalty and teamwork to suppress the importance of independent thinking.

      This starts to happen from our birth, where parents try to give extra protection and care all the time. They will not let their eyes off the child, and will not allow the child to touch anything or play as he desires. This may be purely unintentional and due to their excess love for their child. However, when they unconsciously continue this beyond reasonable limits, they actually harm their child by blocking his creative sense and opportunities for self-realization. After all, humans are not as weak as we seem.

      As I discussed in the previous post, schools too exert enough pressure to kill off whatever independent thinking capability is left within a person. This will continue to happen until the exam-oriented education structure vanishes and people realise the value of each other irrespective of the educational or social background they come from. Sir Ken Robinson presents this point nicely in this TEDTalk. Take some time to watch it, if you haven't seen it before.

      Things get worse when you enter higher education, where you would expect independent and critical thinking to be fostered. You are guaranteed poor grades if you challenge or try to explore beyond what is taught. Parroted lecture notes must be vomited onto the paper if you want higher grades (whether it's a lack of knowledge or envy is still a puzzle). With the belief that the higher the GPA, the higher the salary, nobody seems bothered to diss the current knowledge system. These professionals are so resistant to change and would never encourage their subordinates to change. This results in a legacy knowledge system that is incapable of solving today's problems.

      When it comes to politics, corporate business or any other form of community activity, you see the obvious. There is very little room (or actually no room) for independent thinkers. You are assured of being sidelined, mocked and harassed, and in the worst case you may even pay the penalty with your life, if you hold a point of view different from the so-called majority (which is actually a minority that has exploited power and force to grab the blind following of the rest, who have been trained not to use their wits by the earlier systems).

      Just think about it independently ;)

      ]]>
      First Meetup of LK Ruby User Group http://laktek.com/2009/10/01/first-meetup-of-lk-ruby-user-group http://laktek.com/2009/10/01/first-meetup-of-lk-ruby-user-group/#comments Wed, 30 Sep 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/10/01/first-meetup-of-lk-ruby-user-group Last Wednesday (30th September 2009), the first-ever meatspace gathering of Sri Lankan Ruby Users was held at Ridgecrest Asia (Pvt) Ltd. There were more than 20 passionate, enthusiastic Rubyists filling the room and I would call it a promising start.

      For several years, I personally knew only a handful of Rubyists in the country. Though we shared a love for the language, we doubted whether we could anticipate wide adoption of Ruby culture in Sri Lanka or ever have an active community going here. One of the main reasons was that, at the time, there was no mainstream industry demand for Ruby. There were only a couple of startups doing Ruby (Rails) based development, and very few developers had the freedom to choose their development toolbox themselves. So someone choosing Ruby as their main language of choice was a rarity.

      However, in the last couple of years things have started to change. Globally, Ruby has received mainstream adoption and the success of Rails made it a de-facto consideration when it comes to web apps. This has made Sri Lankan developers and firms think about Ruby more seriously. We've seen several new and interesting Ruby based projects coming up, and have also heard of several firms considering migrating their legacy code to Ruby. Overall, these are great signs promising some exciting times ahead for aspiring Ruby developers in the country.

      Unlike Java, .NET or other commercial mainstream platforms, Ruby developers are not gauged through professional certifications or training programs. As Matz believed, people should be able to express themselves freely when programming. This is something that cannot be trained or taught; the only way one can absorb these Rubyisms is through passion and practice. That's the key difference between a Rubyist and other commercial developers. But the Ruby community believes in collective effort and in helping each other grow.

      The main idea of forming a Sri Lankan Ruby User Group (LK-RUG) was to help developers get inspired. It is harder to be inspired while working in isolation, especially when you are starting to grasp things. A gathering like this can help people share what they have learnt from their experiences, while picking up a few tips and tricks from others. Behind many great Rubyists there is a community which helped groom them. I believe the same could happen in this country too, and the very first meeting gave positive signs of that. It was an informal, friendly and very enthusiastic gathering. Let's hope we can maintain the same spirit in future meetings too.

      So, if you already hack with Ruby or are eager to learn about Ruby culture, join the Sri Lankan Ruby User Group and participate in the future meetings.

      BTW, Here are the slides from my presentation on "Evolution of Rails", which was done in the first meeting.

      P.S. Special thanks to Sameera Gayan for coordinating the event, and to Ridgecrest Asia (Pvt) Ltd. for offering the location for the meeting.

      Update (05/10/2009):

      Gaveen's thoughts on the meetup

      Flickr Photoset of the meetup (uploaded by Gaveen)

      ]]>
      Can an Introvert be a better leader for Sri Lanka? http://laktek.com/2009/12/05/can-an-introvert-be-a-better-leader-for-sri-lanka http://laktek.com/2009/12/05/can-an-introvert-be-a-better-leader-for-sri-lanka/#comments Fri, 04 Dec 2009 16:00:00 GMT Lakshan Perera http://laktek.com/2009/12/05/can-an-introvert-be-a-better-leader-for-sri-lanka This morning, I came across an interesting article in Forbes titled “Why Introverts Make the Best Leaders”. It gives some really good reasons why introverts could lead better than their extroversive counterparts, who are normally considered as the natural leaders.

      Being an introvert myself, I know the majority of people have the perception that introverts are just a bunch of obnoxious people with whom they don't like to interact and whom they would never consider as leaders. Surprisingly, as the article points out, the world's most successful business moguls, including Bill Gates and Warren Buffett, are introverts. Even in the Sri Lankan business context, there are successful personalities like Dr. Hans Wijayasuriya (CEO of Dialog), who I believe is an introvert.

      Can an introvert rule a country? The article considers Barack Obama to be somewhat of an introvert. How about Sri Lanka? Apart from the late President J.R. Jayawardena, I don't see any of our leaders who even slightly fit the "introvert" label. Maybe an introvert can never gain enough popularity to become the leader in our highly extroverted society.

      But I believe an introvert would be better suited as the leader of our country. Here are some of the reasons that make me believe it.

      1. As the above article points out, introverts think first and talk later. Compare this with our leaders, who spit all sorts of bullshit that comes to their tongues. Have we ever had a leader who could stand by his words?

        Actually, this is also the main reason why introverts have become unpopular in a talkative, extroverted world. Introverts don't talk much, but when they do, they really know what they are saying.

      2. Introverts draw inspiration and motivation from their work, rather than from social popularity or material wealth. This gives them a better chance of achieving their goals.
      3. When it comes to decision making, introverts are more intuitive. They do not make decisions out of feelings. Naturally, they have a better sense of and self-belief in what they can do, and don't tend to reverse their decisions due to external pressure.
      4. They are good listeners. Being a good listener doesn't mean anyone can say anything and influence their decision making. Introverts know how to separate the grain from the chaff, because they listen with the mind, not with the heart.
      5. Introverts have the ability to understand the capacity of their subordinates better. This makes sure the right people are used for the right task, rather than appointing people based on personal relationships or trust (which has been a huge mistake).

        Also, our country needs to involve intellectuals actively in policy making if we want to achieve sustainable development. Most intellectuals are also naturally introverts, and introverts feel more comfortable working with another introvert than with an extrovert.

      ]]>
      Interesting stuff to watch out in 2010 http://laktek.com/2010/01/02/interesting-stuff-to-watch-out-in-2010 http://laktek.com/2010/01/02/interesting-stuff-to-watch-out-in-2010/#comments Fri, 01 Jan 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/01/02/interesting-stuff-to-watch-out-in-2010 We are already into the 2nd decade of the 21st century, and it is very evident that this will be the decade where Cloud Computing, the Realtime Web and the Mobile Web start to rule!

        As per some buzzword fanatics, this year (2010) will be the transition year from web 2.0 to web 3.0. Buzzwords aside, as a web application developer I too expect to see the rise and mainstream adoption of some very interesting technologies during this year.

        Web Sockets

        Remember how AJAX changed the face of the web in the 2000s? As I see it, Web Sockets will be the new AJAX of 2010. It's actually the next step from AJAX in improving the face of the web. Web Sockets allow two-way communication between the browser and the server, so the web would no longer be limited to HTTP's stateless request/response behaviour. The Web Sockets API comes as an upgrade to the HTTP protocol in the HTML5 specification, and a lot of browser vendors and server developers have already shown interest and started adding support for it. Currently, the Google Chrome beta supports the client-side Web Socket API. In 2010, we can expect other browser vendors, including Mozilla, to support the Web Socket API, enabling web app developers to come up with richer real-time user experiences.

        HTML5

        Apart from WebSockets, there are a lot of other interesting developments in the HTML5 specification awaiting mainstream adoption this year. Many new browsers have started to support the audio/video elements, which will finally allow us to ditch dirty proprietary plugins (i.e. Flash, QuickTime, etc.). Other interesting features in the HTML5 specification include offline data access and the geolocation API, which will be really vital for an improved mobile web user experience. Google already utilises these features in their mobile web apps, which should give a big boost to widespread adoption.

        Rails3

        Exactly one year after the announcement, we are finally getting to see the fruits of the epic merge between Rails & Merb. A much faster, more modular and extensible version of our favourite web framework is almost ready to be released as Rails3 within this year. To get more details on the improvements in Rails3, please follow the blog series written by Yehuda Katz on the Engine Yard blog.

        NoSQL Movement (Schema-less Databases)

        Last decade, we only heard that big boys like Google (BigTable) and Amazon (Dynamo) were using schema-less key-value data stores. However, projects like MongoDB, Redis, CouchDB & Tokyo Cabinet are giving the rest of us the opportunity to get a taste of it. Schema-less databases are proving to be really flexible compared to traditional relational databases for certain types of projects. The NoSQL movement will surely gain more steam in 2010, so ignore it at your peril!

        Git

        You may wonder, isn't Git already a mainstream technology from the last decade? It's true that it's used to manage the world's largest FOSS project, Linux. But the real power of Git goes beyond a Distributed Version Control System. GitHub, a business built entirely on Git, is becoming very popular. Certainly, Git has opened up a new dimension in collaborative development and distributed file systems. I believe there are a lot of other uses for Git, from a simple CMS to manage your personal blog to distributed data mining on large projects. If you haven't checked out Git yet, I recommend you add it as one of your todos for this year.

        Node.js (server-side javascript)

        The concept of server-side JavaScript dates back to the 1990s, to the days when Netscape used it as a scripting language in their LiveWire servers. For two decades JavaScript couldn't extend its client-side reign into server-side environments. However, the release of Node.js, an evented I/O framework for the V8 JavaScript engine, has again made JavaScript a strong contender as a server-side development language. Node.js differs from traditional call-stack based frameworks by having a non-blocking API, which is strongly supported by the callback-based & evented nature of JavaScript. If you never cared to understand JavaScript and thought jQuery could save your day, now there are better reasons to dig deeper into the world's most misunderstood language.

        What are other fascinating technologies, you would keep an eye in this year?

        ]]>
        Understanding Election Results through Economic Theory of Democracy http://laktek.com/2010/01/28/understanding-election-results-through-economic-theory-of-democracy http://laktek.com/2010/01/28/understanding-election-results-through-economic-theory-of-democracy/#comments Wed, 27 Jan 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/01/28/understanding-election-results-through-economic-theory-of-democracy Though it's been two days since the announcement of the results, there is still no end to the spread of speculation & rumors about the concluded Presidential Election in Sri Lanka. I don't have any strong political bias towards either of the main two candidates, and didn't want to blindly accept anything that was reported.

        This morning, I came across an interesting Political Science literature written by Anthony Downs, named "An economic theory of democracy". I only could read the WikiSummary of it, but while going through it, my mind eventually mapped it to the context of the Sri Lankan Presidential Election. It helped me to dispel some of the doubts that were in my mind and understand how majority of the people would have voted.

        I thought of jotting them down here for others who are interested. Also, I hope someone who is more knowledgeable on this subject will correct me if my line of thought is wrong (I haven't done any formal study of Political Science beyond casual reading here and there).

        Downs defines basic logic of voting as follows:

        In a world of perfect information, each voter would compare his expected utility of having party A (incumbent) in government (for another term, that is) with the expected utility of having party B (opposition) in government. This utility differential would determine each voter's choice at the ballot box.

        In our case, party A would be the current president, Hon. Mahinda Rajapakse, and party B is the opposition's common candidate, Gen. Sarath Fonseka.

        Further Downs mentions there are several factors that would matter to a typical voter and would modify the above model. Let's consider each of those factors.

        1. He doesn't really know what the future holds, so he doesn't know which party's rule will give him greater utility (in the future). So instead, he will instead compare the utility he got over the last term from party A with what he thinks party B would have provided under the same circumstances; if he thinks party B would have brought him more utility, he votes for B.

        I think from the first Presidential Election onwards, what mattered most to the average Sri Lankan voter was the civil war with the LTTE in the north & east. Hence, the end of the war was the biggest utility he got during the last term of party A (i.e. Mahinda Rajapakse). Though Gen. Sarath Fonseka played a major role in the war victory, I feel the other parties involved with him (and their broken promises on ending the war in the past) would have prevented the majority from voting for party B over A.

        2. He doesn't just look at raw utility differential, though; he also considers the trend. Is A getting better or worse? If it is getting better, then the voter will forgive A for early failures to deliver utility.

        This is the point the government (party A) exploited in the last 2 months after the announcement of the election. Fly-over bridges, international stadiums, power plants and all sorts of other development efforts started to blossom within this period. Also, fuel prices and the prices of other essentials were brought down tactfully. Moreover, the government again used peace as a lucrative promise for future development. Voters would have considered this a positive trend. While the opposition clearly showed that the government (party A) had failed to fulfill many of the promises in its last election manifesto (Mahinda Chintanaya), voters seem to have forgiven those failures.

        3. If A and B would have provided equal utility, the voter asks himself whether B would have used identical approaches and policies as A, or different ones?

        * If B would have been identical, the voter is indifferent and abstains.

        Apart from ending corruption and government wastage, the opposition (party B) didn't have a vastly different approach when it came to solving other problems (improving agriculture, education and healthcare, or creating more job opportunities), which would have failed to convert a subset of voters from their current stance.

        Also, as I see the above point could also explain the reason for the lower voter turnout in North and East. Both A & B didn't seem to have significantly different or new approaches when it comes to solving the prevailing problems of those areas. This would have led many from those areas to abstain from voting.

        * If B would have provided equal utility but by different means, then the voter concludes: "Okay, a vote for B is a vote for something to change, but a vote for A is a vote for no change." He then must evaluate whether change (generally) is a good thing. To make this evaluation, performance evaluations come into play. Based on the history he has seen of various parties governing in various circumstance, he asks himself, "How much utility would the ideal government have delivered me under the circumstances that A has governed in?" If A stacks up well in comparison, he votes against change (i.e. for A). If not, he votes for change (i.e. for B) and hopes for the best.

        Ending corruption & wastage in government was the popular slogan of Gen. Fonseka's campaign (party B). Here I believe the voters were again split into two sides based on performance evaluations. Some thought Gen. Fonseka, who comes from a background totally different from the political arena, would surely bring the change they wanted by getting rid of all the corrupt politicians. Another bunch, looking at the past (even considering cases like Hitler, Idi Amin, etc.), feared he would turn into a dictator and decided to vote against the change. Apparently, based on the results, it's evident the latter set was larger.

        Finally, Downs mentions a very important statement in his writing:

        These decisions about utility, however, lack perfect information. He must estimate all these questions about utility based on the "few areas of government activity where the difference between parties is great enough to impress him". In other words, voters use information shortcuts;

        As I understand it, for voters to use information shortcuts they must be able to get a true picture of the context. This is why the existence of free, balanced & impartial media really matters!

        ]]>
        Realie Project: Data Structure & Storage http://laktek.com/2010/02/11/realie-project-data-structure-storage http://laktek.com/2010/02/11/realie-project-data-structure-storage/#comments Wed, 10 Feb 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/02/11/realie-project-data-structure-storage In the last couple of days, I found some time to work on my individual research project for the degree course. The topic area I selected for my project was the "Real-time Web". The Real-time Web is just the opposite of the current way we use the web: rather than us checking (polling) content providers for updates, content providers feed (push) the updates to us. This concept is getting rapid adoption and I believe it will be the de-facto behaviour of the web in a couple of years.

        The application I decided to build is a code editor with real-time collaboration capabilities. Following hacking traditions, and for ease of remembrance, the project was code-named "Realie" (it was the zillionth time the name was changed and I hope it will stick this time).

        Rather than developing the project behind closed doors and finally presenting a thesis (which would be utterly boring), I thought of building the project in the open, sharing the code and discussing design decisions with the community. I believe what really matters is the experience and knowledge I can gather from this, rather than the final grade I get for it.

        So let's start by looking at the initial data structure and storage decisions.

        What is Realie?

        Most of you may have heard of or already be using Google Wave & Etherpad. Realie was also inspired by those two projects. While Google Wave & Etherpad are known as real-time collaborative editors (or canvases) for the general public, with Realie we try to cater to the niche of hackers & developers. As we know, development is already a collaborative process, which involves a lot of real-time communication & decision making. As developers, we also know this process is not seamless and can be painful. This is the void Realie tries to fill.

        Putting it in simple terms, Realie would be like pastie or gist where multiple people can view and edit at the same time. There will be other jazzy features in the project, but this is the essence of it. Imagine how such a tool could make your remote pair-programming, code reviewing & brainstorming sessions a breeze.

        Starting from scratch

        As I was starting the project, Google acquired Etherpad, and that move led Etherpad to release their code as open source. Though it sounded like a perfect opportunity for me to fork their code and get my project done, I decided to start from scratch. Etherpad is a grown project and they have already made certain design decisions. Adopting them wouldn't help me gain any experience of the implications of designing a real-time system, or to explore better ways of doing things.

        One of the first challenges I had to face was deciding the data storage mechanism I'm going to adopt for this project. My initial idea was to create each editing pad as a physical file in the system and track the changes each user makes using Git. This sounded very unrealistic, as disk IO would be very slow and executing git commands via the shell would be even slower!

        A relational database doesn't seem to be a good fit for the task either, since we will be doing massive writes and continuous querying. The best option in this scenario was to use a key-value based store.

        Choosing Redis

        I got convinced to use Redis as the data store after hearing a lot of good things about it and seeing some impressive benchmarks.

        Redis goes beyond a normal key-value store: we can have lists, sets or sorted sets as data structures in Redis, and it's possible to do operations like sorting, and taking differences, intersections and unions of the data. Also, one of the most interesting features I saw in Redis is its persistence options. You can use it as an in-memory-only store (which is very fast), write the data to disk periodically, or write every change to disk (append-only file).

        Data Structure

        The atomic data unit of Realie would be a Line. Each pad is composed of a bunch of lines. A line is also equivalent to a single edit a user makes on a file (pad). When storing a line, we need to store the following attributes along with it - user, pad, content, position (line number in the file) and timestamp.

        In Redis, we can only store data values as strings. For this, each line will be serialized to JSON before being stored in the data-store. JSON serialization also makes it possible to consume & manipulate line contents in client or server easily.

        Since we need to keep references to a line in several places in the data-store, a unique SHA-1 hash based on the contents of the line is calculated and it's used as the key for that line.
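        As a rough sketch of that scheme in Ruby (using the redis and json gems; the key names and sample values here are my own assumptions, not the project's final schema), storing a single line could look like this:

            require 'rubygems'
            require 'redis'
            require 'json'
            require 'digest/sha1'

            redis = Redis.new

            # the attributes tracked for every line (a single edit)
            line = {
              "user"      => "laktek",
              "pad"       => "demo-pad",
              "content"   => "puts 'hello'",
              "position"  => 1,
              "timestamp" => Time.now.to_i
            }

            # the SHA-1 of the line's contents becomes its key
            key = Digest::SHA1.hexdigest(line.to_json)

            # store the serialized line and reference it from the pad's list of lines
            redis.set("line:#{key}", line.to_json)
            redis.rpush("pad:#{line['pad']}:lines", key)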

        As I mentioned earlier, a Pad is made up of a collection of lines. It's basically similar to any source file with lines of code. Beyond that, each pad stores the list of users who are working on it. Users have the option to join or leave a pad as they wish.

        For a pad, there are two basic views. One is the snapshot view, which is the current state of the pad after applying the most recent changes made by the users. The more interesting one is the timeline view, which holds the changes in the order users made them. This view can be used to generate historical versions of the pad (at a given time or checkpoint), or even to create a playback showing how the pad changed over time.
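        Continuing the sketch above (again purely my own assumption of how this could be modelled, not the project's actual implementation), a Redis sorted set scored by timestamp would give a timeline that can be replayed up to any checkpoint:

            # add each edit to the pad's timeline, scored by its timestamp
            redis.zadd("pad:demo-pad:timeline", line["timestamp"], key)

            # rebuild the pad as of a given moment by replaying edits up to that time
            checkpoint_time = Time.now.to_i
            keys_up_to = redis.zrangebyscore("pad:demo-pad:timeline", 0, checkpoint_time)
            lines = keys_up_to.map { |k| JSON.parse(redis.get("line:#{k}")) }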

        Source Code

        You can check out the code from the following GitHub repository - http://github.com/laktek/Realie. Please note, the project source code currently contains only the models & specs for the above data structures, and it could change even as you read this.

        ]]>
        Building Real-time web apps with Rails3 http://laktek.com/2010/02/16/building-real-time-web-apps-with-rails3 http://laktek.com/2010/02/16/building-real-time-web-apps-with-rails3/#comments Mon, 15 Feb 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/02/16/building-real-time-web-apps-with-rails3 In deciding the web framework to build Realie with, one of the main considerations was whether I should move to a totally asynchronous framework. Most established web frameworks, including my favorite, Rails, are built in a synchronous manner and follow a call-stack based model. Real-time web apps need to be asynchronous, and the evented programming model is ideally suited to this.

        Since there has been a lot of hype around Node.js based async web frameworks in the last couple of months, my initial idea was to use such a framework for my project. However, that felt like a totally new learning curve. Apart from grasping how to use JavaScript on the server side, it also meant I needed to adopt a totally new ecosystem for templating, routing, etc.

        However, when I revisited my requirements it was clear that only a part of the web app really needs to be asynchronous. Most parts can still be done with a traditional call-stack based web framework. Using a fully async web framework to build the entire app seemed unnecessary, and it also felt like overkill to run two different apps to serve the sync and async parts.

        In this context, I came to know about Cramp, an asynchronous Ruby framework written by Pratik Naik. The best thing about Cramp is its ability to use Rack middleware (keep in mind it's not fully Rack compliant). Then came the idea: how about using Rails and Cramp together to build a hybrid real-time web app? Rails3 makes it easy to mix any other Rack endpoint with Rails, so this sounded like a perfect solution to my problem.

        Since Cramp follows an evented model, it needs an evented web server such as Thin or Rainbows!. Further, Cramp implements WebSocket support for these two server backends.

        Integrating Rails3 and Cramp

        First of all, you will need to bundle the Cramp gem with your Rails app. To do this, open the Gemfile and add the following:

            gem "cramp", :require => 'cramp/controller'

        For my work I only needed the Cramp controller (it also ships an async model layer), so for now that's all I required.

        As I mentioned earlier, to support web sockets, Cramp needs to extend the web server we use. To specify the web server, I added an initializer (config/initializers/cramp_server.rb) with the following line:

            Cramp::Controller::Websocket.backend = :thin

        Then, I created a simple Cramp controller, which can respond to web sockets (app/cramps/communications_controller.rb)

            class CommunicationsController < Cramp::Controller::Websocket
                periodic_timer :send_hello_world, :every => 2
                on_data :received_data
        
                def received_data(data)
                    if data =~ /stop/
                        render "You stopped the process"
                        finish
                    else
                        render "Got your #{data}"
                    end
                end
        
                def send_hello_world
                    render "Hello from the Server!"
                end
            end

        Now the fun part! The new Rails3 router supports pointing to any Rack compatible endpoint, so we can easily hook our Cramp controller up for public access. In config/routes.rb add the following:

          match "/communicate", :to => CommunicationsController

        Our Cramp endpoint can co-exist with the rest of the Rails controllers without any issues.
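
        On the client side, talking to this endpoint is just a matter of using the browser's WebSocket API. A minimal sketch (the host and port here assume the local setup described below):

            // open a socket to the Cramp endpoint mounted at /communicate
            var socket = new WebSocket("ws://localhost:3000/communicate");

            socket.onmessage = function(event){
              // the periodic "Hello from the Server!" messages and echoes arrive here
              console.log(event.data);
            };

            socket.onopen = function(){
              socket.send("ping");   // server replies with "Got your ping"
              // socket.send("stop"); // would make the server finish the connection
            };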

        Voila!

        Another important change with Rails3 is that it's also a fully compatible Rack app now. This means that, like any other Rack app, we can start our Rails app by running rackup.

            rackup -s thin -p 3000 --env production

        This will start our app using the Thin server backend on port 3000. Keep in mind we need to provide an environment other than development to avoid the Rack::Lint middleware, because Cramp is not fully compliant with the Rack SPEC and Lint will throw exceptions.

        ]]>
        I'm with OpenNebula this Summer! http://laktek.com/2010/05/01/im-with-opennebula-this-summer http://laktek.com/2010/05/01/im-with-opennebula-this-summer/#comments Fri, 30 Apr 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/01/im-with-opennebula-this-summer I had the opportunity to get selected for Google Summer of Code in my freshman year itself. The experience I gained that summer working with the SilverStripe project boosted my self-confidence and helped me immensely in shaping my career.

        This year, which happens to be my final year as an undergraduate, I'm going to have yet another stint with Summer of Code. This time, it's with OpenNebula. OpenNebula is an open source toolkit for cloud computing. The project is relatively young and small, but it could make a great impact in the future. In simple terms, OpenNebula lets you run your own cloud hosting service like Amazon EC2.

        As a developer, who makes use of cloud platforms, I really want to see open standards getting adopted among cloud service providers. Vendor lock-in is the biggest threat I see when moving to cloud based platforms. Projects such as OpenNebula are great initiatives to avoid this. OpenNebula is also one of the main supporters for the OCCI standard interface. I have been following their developments closely, from the first day I got to know about them.

        When I saw that OpenNebula had been selected as a mentoring organization for the first time in this year's GSoC, I thought this would be a great chance for me to contribute to them. Another interesting thing about OpenNebula is that they use Ruby as their main development language.

        This summer, I will work on building a web-based administration console for OpenNebula. I believe it will make OpenNebula more usable and increase its adoption. It will be something similar to the AWS (Amazon Web Services) management console, but richer in terms of capabilities.

        Though we are still in the brainstorming phase of the project, I can give a small hint: we will be using Sinatra and other Rack middleware to build it. So it's yet another chance to show the power and flexibility of micro-frameworks. I will be sharing my experiences during the project via this blog. It should be really fun and exciting.

        Finally, a big shout-out goes to my mentor, Jaime Melis, who has been very supportive since the time of preparing the proposal. It's always a pleasure to work with someone like him, who is really passionate and knowledgeable.

        If you are interested in reading my full project proposal : http://docs.google.com/Doc?docid=0AeUIyatONYiTZGY3ZnYyZmNfNjljanJyZDNjNg&hl=en

        ]]>
        Implementing Web Socket servers with Node.js http://laktek.com/2010/05/04/implementing-web-socket-servers-with-node-js http://laktek.com/2010/05/04/implementing-web-socket-servers-with-node-js/#comments Mon, 03 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/04/implementing-web-socket-servers-with-node-js Web Sockets are one of the most interesting features included in the HTML5 spec. They open up a whole different paradigm in web application development by allowing asynchronous, long-lived connections between client and server. With Web Sockets now supported in Google Chrome's beta release, it's a signal that the time has come to use them in your apps.

        However, WebSockets don't really play well with traditional synchronous web server environments. Using evented libraries such as Node.js for Web Socket servers seemed more practical and scalable. But the initial versions of Node.js didn't have built-in support for Web Socket connections. There were several Web Socket server implementations based on Node.js, which overcame this problem by hijacking its HTTP module.

        Node.js still doesn't include WebSockets in its core modules the way some other languages do (e.g. Go). However, the recent overhaul of the HTTP module has made implementing web sockets a whole lot easier. Node.js's HTTP server module now emits an "upgrade" event each time a client requests an HTTP upgrade. This event can be trapped when implementing a web socket server.

        Here is a simple, low-level example of a Node.js based HTTP server which supports both regular HTTP requests and web socket connections.

        var sys = require("sys");
        var net = require("net");
        var http = require("http");
        
        function createTestServer(){
          return new testServer();
        };
        
        function testServer(){
          var server = this;
          http.Server.call(server, function(){});
        
          server.addListener("connection", function(){
            // requests_recv++;
          });
        
          server.addListener("request", function(req, res){
            res.writeHead(200, {"Content-Type": "text/plain"});
            res.write("okay");
            res.end();
          });
        
          server.addListener("upgrade", function(req, socket, upgradeHead){
            socket.write( "HTTP/1.1 101 Web Socket Protocol Handshake\r\n"
                        + "Upgrade: WebSocket\r\n"
                        + "Connection: Upgrade\r\n"
                        + "WebSocket-Origin: http://localhost:3400\r\n"
                        + "WebSocket-Location: ws://localhost:3400/\r\n"
                        + "\r\n"
                        );
        
            socket.ondata = function(d, start, end){
              // decode the incoming frame and strip the WebSocket framing bytes (0x00 ... 0xFF)
              var original_data = d.toString('utf8', start, end);
              var data = original_data.split('\ufffd')[0].slice(1);
              if(data == "kill"){
                socket.end();
              } else {
                sys.puts(data);
                socket.write("\u0000", "binary");
                socket.write(data, "utf8");
                socket.write("\uffff", "binary");
              }
            };
          });
        };
        
        sys.inherits(testServer, http.Server);
        
        var server = createTestServer();
        server.listen(3400);

        There is a more high-level and elegant Web Socket server library (http://github.com/miksago/node-websocket-server) in development by Micheil Smith. It is built on the new HTTP library and is compatible with draft 76 of the Web Sockets spec (which includes a bunch of security improvements).

        Here is an example of how to implement a web socket server with the above-mentioned library.

        var sys = require("sys");
        var ws = require('./vendor/node-websocket-server/lib/ws');
        
        function log(data){
          sys.log("\033[0;32m"+data+"\033[0m");
        }
        
        var server = ws.createServer();
        server.listen(3400);
        
        server.addListener("request", function(req, res){
          res.writeHead(200, {"Content-Type": "text/plain"});
          res.write("okay");
          res.end();
        });
        
        server.addListener("client", function(conn){
          log(conn._id + ": new connection");
          conn.addListener("readyStateChange", function(readyState){
            log("stateChanged: "+readyState);
          });
        
          conn.addListener("open", function(){
            log(conn._id + ": onOpen");
            server.clients.forEach(function(client){
              client.write("New Connection: "+conn._id);
            });
          });
        
          conn.addListener("close", function(){
            var c = this;
            log(c._id + ": onClose");
            server.clients.forEach(function(client){
              client.write("Connection Closed: "+c._id);
            });
          });
        
          conn.addListener("message", function(message){
            log(conn._id + ": "+JSON.stringify(message));
        
            server.clients.forEach(function(client){
              client.write(conn._id + ": "+message);
            });
          });
        });
        ]]>
        Apple is repeating the same mistakes from the past http://laktek.com/2010/05/21/apple-is-repeating-the-mistakes-from-the-past http://laktek.com/2010/05/21/apple-is-repeating-the-mistakes-from-the-past/#comments Thu, 20 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/21/apple-is-repeating-the-mistakes-from-the-past In the 1980s, Apple jumped out to an early lead in personal computers, but then got selfish. Steve Jobs, a notorious control freak, just could not play well with others. Along came Microsoft, with Windows, which was a knockoff of Apple's operating system. Microsoft partnered with everyone and today has 90 percent market share, while Apple's share lingers in the single digits. Today the battlefield is mobile devices, and just as before, Apple jumped out to an early lead. And just as before, Jobs got selfish. He won't support Flash, or any cross-platform tools—because he wants developers locked into his platform, and his App Store, where he collects a 30 percent commission. Daniel Lyons (Newsweek) - http://blog.newsweek.com/blogs/techtonicshifts/archive/2010/05/20/sayonara-iphone-why-i-m-switching-to-android.aspx#

        Clearly, Android is becoming the new Windows (or even better, because it's open source). Just as in the 1980s, when Microsoft knocked off Apple with an OS that would run on any platform, today it appears Google will do the same in the mobile market.

        No matter how beautiful they are, people don't like to stay inside walled gardens. Apple doesn't seem to have learned this lesson.

        ]]>
        Real-time Collaborative Editing with Web Sockets, Node.js & Redis http://laktek.com/2010/05/25/real-time-collaborative-editing-with-websockets-node-js-redis http://laktek.com/2010/05/25/real-time-collaborative-editing-with-websockets-node-js-redis/#comments Mon, 24 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/25/real-time-collaborative-editing-with-websockets-node-js-redis A few months ago, I mentioned that I'm developing a real-time collaborative code editor (codenamed Realie) as my individual research project at the university. Since then I have done a couple of posts on the design decisions and the technologies I experimented with for the project. After some intensive hacking, today I've got something tangible to share with you.

        Currently, I have implemented the essentials for real-time collaboration, including the ability to watch who else is editing the file, view others' edits, chat with the other collaborators and replay how the edits were made. You may think this is more or less similar to what Etherpad had - yes, it is! However, this is only the first part of the project, and the final goal is to extend it into a collaborative code editor (with syntax highlighting and SCM integration).

        Web Sockets

        The major difference between Realie and other real-time collaborative editors (e.g. Etherpad, Google Docs & Wave) is that it uses web sockets for communication between client and server. Web Sockets suit cases like this perfectly, where we need asynchronous, full-duplex connections. Compared to alternatives such as long-polling or Comet, web sockets are really efficient and reliable.

        In traditional HTTP, every message needs to be sent with HTTP headers. With web sockets, once a handshake is done between client and server, messages can be sent back and forth with minimal overhead. This greatly reduces the bandwidth usage and thus improves performance. Since there is an open connection, the server can reliably send updates to the client as soon as they become available (no client polling is required). All this makes the app truly real-time.

        As of now, the only browser to implement web sockets is Google Chrome. However, I hope other browsers will soon catch up, and Mozilla has already shown hints of support. There are also Flash-based workarounds for other browsers. For now, I decided to stick with the standard Web Socket API.

        Taking Diffs and Applying Patches

        In case you're wondering, this is how the real-time collaboration is done:

        1. When one user makes a change, a diff is created for the change and sent to the server.
        2. The server then posts this diff to the other connected collaborators of the pad.
        3. When a user receives a diff, their content is patched with the update.

        So both taking diffs and applying patches are executed on the client side. Handling these two actions in the browser was made trivial thanks to the comprehensive diff-match-patch library written by Neil Fraser, roughly as shown below.
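
        A sketch of that client-side flow with the diff-match-patch API (assuming the diff_match_patch script is loaded; the sample texts and variable names are mine):

            // the pad text before and after a local edit
            var previousText = "function add(a, b) { return a + b; }";
            var editedText   = "function add(a, b) {\n  return a + b;\n}";

            var dmp = new diff_match_patch();

            // on the sender: diff the old text against the edited text
            // and serialize the resulting patches for the server
            var patches = dmp.patch_make(previousText, editedText);
            var payload = dmp.patch_toText(patches);

            // on the receivers: rebuild the patches and apply them to the local copy
            var incoming = dmp.patch_fromText(payload);
            var result = dmp.patch_apply(incoming, previousText);
            var patchedText = result[0];   // result[1] holds per-patch success flags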

        However, on some occasions these two actions need to be executed concurrently. By default, client-side scripts run in a single thread, which makes execution synchronous and slow. As a solution, I tried using the HTML5 Web Workers API (implemented in WebKit & Mozilla). Separate worker scripts were used for taking diffs and applying patches. Jobs were passed to these worker scripts from the main script, and the results were passed back once execution was complete. Not only did this make things faster, it also made them more organized.
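
        As a simplified sketch of that idea (the worker file name and message format here are illustrative, not the project's actual ones):

            // main script: hand the diff work to a background worker
            var diffWorker = new Worker("diff_worker.js");

            diffWorker.onmessage = function(event){
              // event.data carries the serialized patch produced in the worker
              console.log("patch ready:", event.data);
            };

            diffWorker.postMessage({ previous: "old text", edited: "new text" });

            // diff_worker.js: runs off the main thread
            importScripts("diff_match_patch.js");
            var dmp = new diff_match_patch();

            onmessage = function(event){
              var patches = dmp.patch_make(event.data.previous, event.data.edited);
              postMessage(dmp.patch_toText(patches));
            };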

        Node.js for Backend Servers

        Initially, I started off the server implementation in Ruby (and Rails). Ruby happened to be my de-facto choice, as it was my favorite language and I had enough competency with it. However, I soon started feeling that Ruby was not the ideal match for such an asynchronous application. With EventMachine it was possible to take things to a certain extent. Yet most Ruby libraries are written in a synchronous manner (including Rails), which didn't help the cause. As an alternative, I started to play with Node.js and soon felt it was the right tool for the job. It brings JavaScript's familiar event-driven model to the server, making things very flexible. On top of that, Google's V8 JavaScript engine turned out to be really fast. I decided to ditch the Ruby based implementation and use Node.js fully for the backend system.

        The backend consists of two parts: one for serving normal HTTP requests and one for web socket requests. For serving HTTP requests, I used a Node.js based web framework called Express. It follows the same ideology as Sinatra, so it was very easy to adapt to.

        The web socket server was implemented based on the recently written web socket server module for Node.js by Micheil Smith. If you are interested in learning more about implementing web socket servers with Node.js, please see my earlier post.

        Message delivery with Redis Pub/Sub

        On each pad, there are different types of messages that users send on different events. These messages need to be propagated correctly to the other users.

        Mainly, the following messages need to be sent:

        • When a user joins a pad
        • When a user leaves a pad
        • When a user sends a diff
        • When a user sends a chat message

        For handling message delivery, I used Redis' newly introduced pub/sub implementation. Every time a user connects (i.e. visits a pad), two Redis client instances are initiated for them. One client is used for publishing their messages, while the other listens for incoming messages from subscribed channels.
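
        A stripped-down sketch of that wiring with the node_redis client (the channel name is made up for illustration, and `connection` stands for the user's web socket connection):

            var redis = require("redis");

            // two clients per connected user: one to publish, one to subscribe
            var publisher  = redis.createClient();
            var subscriber = redis.createClient();

            var channel = "pad:demo-pad";

            subscriber.subscribe(channel);
            subscriber.on("message", function(chan, message){
              // forward the message to this user's web socket connection
              connection.write(message);
            });

            // when this user makes an edit, sends a chat message, joins or leaves
            publisher.publish(channel, JSON.stringify({ type: "diff", user: "laktek", payload: "..." }));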

        Redis as a persistent store

        Redis is not only used for message handling; I also use it as the persistent data store of the application. As a key-value store, Redis provides fast in-memory data access. It also writes data to disk at a given interval (and there is a much safer append-only mode, which writes every change to disk). This mechanism is invaluable for this sort of application, where both fast access and data integrity matter.

        Another advantage of using Redis is the support for different data types. In Realie, the snapshots are stored as strings. The diffs, chat messages and users for a pad are stored as lists.

        There is a well-written Redis client for Node.js which makes the above tasks really simple.
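
        For example, storing a pad's snapshot and appending to its lists boils down to a few calls with that client (the key names here are illustrative, not Realie's actual schema):

            var redis = require("redis");
            var client = redis.createClient();

            var currentText = "function add(a, b) { return a + b; }";
            var serializedDiff = "...";   // a patch produced by diff-match-patch

            // the snapshot of the pad is kept as a plain string
            client.set("pad:demo-pad:snapshot", currentText);

            // diffs, chat messages and users are kept as lists
            client.rpush("pad:demo-pad:diffs", serializedDiff);
            client.rpush("pad:demo-pad:chats", JSON.stringify({ user: "laktek", text: "hello" }));

            // replaying the pad is just a matter of reading the diff list back in order
            client.lrange("pad:demo-pad:diffs", 0, -1, function(err, diffs){
              // apply each diff in sequence to rebuild a historical version
            });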

        Try it out!

        I'm still in the process of setting up an online demo of the app. Meanwhile, you can check out the code and try running the app locally.

        Here is the link to GitHub page of the project - http://github.com/laktek/realie

        Please raise your ideas, suggestions and questions in the comments below. Also, let me know if you are interested in contributing to this project (it is open source).

        ]]>
        Are you creating software to impress one person? http://laktek.com/2010/05/29/dont-create-software-just-to-impress-one-person http://laktek.com/2010/05/29/dont-create-software-just-to-impress-one-person/#comments Fri, 28 May 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/05/29/dont-create-software-just-to-impress-one-person Ever wondered why we have so much crappy and bloated software? The root cause is that it's built just to impress one person.

        This is a widespread disease in the software industry. At the academic level, you will find students writing software to impress their mentors and get the required credit. Then you get developers working for large software firms, who are only concerned with getting the nod from their pointy-headed bosses. Freelancers worry only about getting sign-off from their pesky clients. You may expect startups to build their stuff out of passion, but in reality most startups that run on funding are building software just to impress their VCs. This system is plain wrong!

        There is no software intended to be used by just one person. In most cases, the person being impressed is not the actual end-user of the software, and may not have a clue about what end-users really want.

        If you are a developer, think of the wider audience who'd actually be using your stuff. Don't ignore them! Try to impress those people at the end of the day.

        ]]>
        Handy Git commands that saves my day http://laktek.com/2010/06/04/handy-git-commands-that-saves-my-day http://laktek.com/2010/06/04/handy-git-commands-that-saves-my-day/#comments Thu, 03 Jun 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/06/04/handy-git-commands-that-saves-my-day There are 3 essential weapons every developer should have in their armory: a text editor, a terminal and a source code management system. Picking powerful, flexible and personalized tools will make your workflows more productive. As a developer, I use Vim for text editing, bash as the terminal and Git for source code management.

        Out of those, Git turns out to be the most fascinating piece of software to me. It's more than an SCM system; it represents a paradigm shift in the way we code. Its decentralized nature gives the freedom to experiment and innovate without having to worry about others' work. It brings sharing and collaboration to a new level. It's like democracy in coding!

        A basic understanding of Git's pull-commit-push cycle may be sufficient for most daily needs. However, there is a plethora of other options which deserve some time to comprehend. Here are some such commands, which I found useful and use in my regular workflows.

        git-rebase

        When I first started to use Git, my workflow used to be like this:

          git pull origin master
          git checkout -b branch_for_new_feature
          git status
          git commit -am "commit message"
          <--cycle from step 2-4, until my work is complete-->
          git checkout master
          git merge branch_for_new_feature
          git push origin

        However, on many occasions when I try to push the changes to the remote server (origin), I end up with the following error:

          ! [rejected] master -> master (non-fast forward) error: failed to push some refs to 'ssh://user@server:/repo.git'

        This is because my colleagues have also pushed to the remote server while I was working on the new feature. At this point, I could simply do a git pull to update my local repo, which would merge the remote changes with the current HEAD. In most cases, this leads to a chain of conflicts which require manual resolution. Due to my laziness (or lack of concentration), I often end up deleting what was on the HEAD and replacing it with the upstream changes. Midway, I realize I was actually deleting the new updates on the HEAD that were supposed to be pushed to the remote server. From that point onwards, cleaning up the mess involves pulling out my hair and a lot of swearing!

        Actually, there is a smarter way to do this: use git-rebase. When you do a rebase, Git saves all commits in the current branch that are not in the upstream branch to a staging area and resets the current branch to the upstream branch. Then the saved commits are reapplied on top of the current branch, one by one. This process ensures my newest changes remain the newest.

        The new workflow with rebasing would be:

          git pull origin master
          git checkout -b new-feature-branch
          git status
          git commit -am "commit message"
          <--cycle from step 2-4, until work is done-->
        
          git checkout master
          git pull origin master #update the master before merging the new changes
          git checkout new-feature-branch
          git rebase master #apply the new changes on top of current master
        
          git checkout master
          git merge new-feature-branch
          git push origin

        Though it seems longer than the previous workflow, it helps me to stay away from unnecessary conflicts. Even if they do occur, resolving them is pretty straightforward, as I know for sure which change is the newest.

        git-cherry-pick

        While working on a new-feature branch, I encounter quick fixes that are independent of the new feature and thus can be applied to master straight away. Delaying the release of these fixes until the new-feature branch gets merged to master seems unnecessary. In such cases git-cherry-pick comes in handy. As the name implies, you can pick exactly one commit (by its ref-id) and apply it to another branch, as shown below. To avoid conflicts, those commits should be self-contained patches; if a commit depends on another commit, you will need to apply that one first.
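
        For example, applying a single fix from the feature branch onto master looks roughly like this (commit_sha is the ref-id of the self-contained fix commit):

            git checkout master
            git cherry-pick commit_sha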

        git-blame & git-show

        Some days, you wake up to find someone has changed your pet hack! Rather than blaming the cat for eating the code, you can easily find out who the real culprit is by running:

            git-blame application.js

        It would return the author and last commit SHA for all lines in the file. You can even narrow down the output by specifying the starting and ending lines:

          git blame -L 450,+10 application.js

        If you want to investigate further, such as why the author actually made this change and what other patches were committed along with it, you can run:

          git show last_commit_sha

        git-bisect

        With git-blame you can track down issues that are visible to the naked eye. But most freaking bugs are spread out and harder to detect at a glance.

        This is more common when you work as a team: each member works on modular sections and has tests covering the code they write. Everything seems to be running smoothly until you merge all the modules together. Tests start to fail and you are left with no clue as to what breaks them. In such instances, what we normally do is roll back the commits one by one to find where the trouble was introduced. But this can become a tedious process if you have a large set of commits. Git has a handy assistant for this: the git-bisect command.

        When you specify a range of commits, it will repeatedly chop them in half, using binary search, until you narrow down the commit that introduced the problem. A typical workflow with git-bisect is as follows:

            git bisect start
            # your repo would be switched to a temporary 'bisect' branch at this point
        
            # you mark the current HEAD of the repo as bad
            git bisect bad
        
            # Then you set the last known good version of the repo
            git bisect good version-before-merge
        
            # This will start the bisecting process.
            # Commits from last good version to current version will be chopped in half.
        
            # Then you run your tests
            rake test
        
            # Based on output of the test you mark the commit as good or bad
            git bisect good/bad
        
            # Git will chop the next subset automatically, and return for you to test
        
            # Test and marking process, will continue until you end-up with a single commit,
            # which is supposed to be the one which introduced the bug.
        
            # When bisecting process is done; run:
            git bisect reset
        
            # You will be returned to your working branch.

        git-format-patch/git-apply

        Contributing to some open source projects is as easy as sending a pull request via GitHub. But that's not the case with all of them. Especially in large projects such as Rails, you are expected to submit a ticket to the bug tracker, attaching the suggested patch. You can use the git-format-patch command to create such a patch, which others can simply apply to their repositories for testing.

        In the same way, if you want to test someone else's patches, you can use the git-apply command, as sketched below.
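
        For example, something along these lines creates a patch for the latest commit and applies it on the other side (the generated file name is derived from the commit subject):

            git format-patch -1 HEAD                # writes 0001-commit-subject.patch
            git apply --check 0001-commit-subject.patch   # dry run to see if it applies cleanly
            git apply 0001-commit-subject.patch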

        git submodule

        Git submodules allow you to add, watch and update external repositories inside your main repository. It's my preferred way of managing plug-ins and other vendor libraries in Ruby or Node.js projects.

        To add a submodule to your project, run:

            git submodule add repository_url local_path

        When someone else takes a clone of your repo, they will need to run:

            git submodule init
            git submodule update

        This will import the specified submodules into their environment. Deployment tools such as Capistrano have built-in support for git submodules and will run the above two commands after checking out the code.

        git help command

        Last but not least, I should remind you that Git has excellent documentation, which makes learning it so easy. To learn the options and use cases of a certain command, all you need to do is run:

            git help command

        Apart from the default man pages, there are plenty of resources on the web about Git, including the freely available books Pro Git and the Git Community Book.

        If you have any other interesting tips on using Git, please feel free to share them in the comments.

        ]]>
        The Best Role Model of Our Time http://laktek.com/2010/07/23/best-role-model-of-our-time http://laktek.com/2010/07/23/best-role-model-of-our-time/#comments Thu, 22 Jul 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/07/23/best-role-model-of-our-time "Who is the biggest role model of your life?" My answer to that question would be Muttiah Muralitharan. I know that answer would confuse most of you. You would expect a geek like me to name someone like Linus Torvalds, Yukihiro Matsumoto or Sergey Brin and Larry Page as a role model. But how can a cricketer be my role model?

        Murali

        In an era where vanity role models are hyped to the top by the mass media, Murali stands out from the rest on his own feet. He is great not only for his phenomenal performances on the cricket field, but for his character. None of us could ever emulate his unique bowling action. But there are certain things from Murali's character that we can try to emulate.

        A Human!

        It was somewhere around 1998. As a kid, I had the rather unusual hobby of collecting cricket statistics. In those days, I didn't know of the existence of Cricinfo and didn't even have a computer. I used to record all the scorecards of the matches played during that time in an exercise book. However, my record collection was never complete; I was missing a lot of scorecards from old matches. Then my uncle tipped me off that the Sri Lankan Cricket Board has a library with all the old Wisdens and "The Cricketer" magazines, which I could use to collect the missing match records. After a lot of persuasion, I was able to convince him to take me there.

        Inside the Cricket Board, we had to go past a gym to get to the library. I saw a very familiar face inside the gym. It was Murali! He was there with another cricketer (as I remember, it was Ravindra Pushpakumara). That was the first time I saw an international cricketer in real life. Murali also saw me and waved. Then I tried to go inside the gym to get his autograph. The gym instructor was there; he told me not to disturb the players and didn't allow me to go inside. As I was turning back in disappointment, the most surprising thing happened. Murali came to the door and signed my autograph book!

        This was the period when Murali had been called for chucking for the second time in Australia, and he was preparing to undergo medical tests to prove the legality of his bowling action. So while he signed my autograph, I told him how angry I was with the Australians for the injustice done to him. Handing my autograph book back, with a bright smile on his face, Murali said in Sinhalese, "owa ohoma tamai malli..." (these things happen).

        I couldn't believe how humble and down to earth this person was. He was ready to go out of his way to make some random, pesky kid happy. He could still afford to smile genuinely and take things lightly amidst all the trouble he was experiencing at that time. Even today, when I reminisce about this incident, it feels like a dream.

        A Geek!

        For me, Murali is a geek. He's not a geek who uses Linux as his primary OS, lurks on IRC or hacks micro-controllers. But his passion for and obsession with the game of cricket make him a geek in that field.

        He's not only a geek, but a hacker. He changed the face of off-spin bowling. When the off-spinner was about to go extinct from the game of cricket, Murali came along and made it more challenging. He forked the doosra from Saqlain Mushtaq and hacked it into a more lethal weapon. By being different from the rest, he created controversy.

        Also, Murali focused only on doing what he could do best. He didn't have to ride Lamborghinis, have affairs with Bollywood actresses (but he's got a beautiful wife) or get into politics to keep himself in the limelight. He made the world talk about him and respect him by doing what he could really do - bowling.

        A Workaholic!

        When Murali first won his test cap for the Sri Lanka team, it was not the professional, winning outfit you find today. At that time, Sri Lanka was ranked only ahead of Zimbabwe, and playing for the national team was not even considered a profession. Due to the political instability of the country during that time, there was no certainty about tours and no clear policies for player selection. The future was gloomy and involved a lot of risk. I would say the situation was analogous to working for a startup in the corporate world.

        He could have easily stayed in Kandy looking after his family business. Instead, Murali took the challenge and came to Colombo to join the national team. It wasn't an easy start, and his brilliance was not an overnight wonder. As the stats show, it took 27 test matches and 3 years to complete his first 100 wickets. During that time, he bowled full days without much support from the other end and tasted heavy defeats.

        Murali persisted and persevered harder. As his performances improved, so did the Sri Lankan team's winning ratios. However, he focused not on his personal feats, but on his team's victory. He had no problem playing under different captains, even juniors to him like Mahela and Sanga. He delivered his best in all circumstances. He never let his personal ego hinder his duty.

        Over the last 18 years, he has been working like a horse. For the record, he bowled 33% of all the overs Sri Lanka bowled during that time. He always made himself available for national duty over other, more lucrative engagements, such as county cricket in England.

        How many of us can have such dedication and commitment to our duty? How many of us would complain if we had to repeat the same old boring job? Murali was no such person. When he was on the field, he seemed to be enjoying every moment of it. That must be the secret mantra of Murali's success. That's why I call him the best role model of our era.

        Hail Murali!

        (Photo credit: Wikimedia Foundation - http://upload.wikimedia.org/wikipedia/commons/d/d4/MuralitharanBust2004IMG.JPG)

        ]]>
        Building Modular Web Apps with Rack & Sinatra http://laktek.com/2010/12/22/building-modular-web-apps-with-rack-sinatra http://laktek.com/2010/12/22/building-modular-web-apps-with-rack-sinatra/#comments Tue, 21 Dec 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/12/22/building-modular-web-apps-with-rack-sinatra Working on OpenNebula's administration tool in the last Google Summer of Code was one of the best development experiences I had during 2010. The project has been successfully completed and is awaiting release with a future version of OpenNebula.

        In this post, I would like to give some insights on its development, since I believe it stands as a good case study on how to build modular web apps, especially using Rack & Sinatra.

        Background

        The main objective of OpenNebula's Admin Tool is to enable easy management & monitoring of your OpenNebula cloud setup via a web based interface. Basically, this includes management of users, hosts, virtual networks and virtual machines (VMs). It's planned to be extended further to offer features like infrastructure checks and installation and configuration tweaking of an OpenNebula setup (which are already in development).

        Also, it is expected to be self-hosted, interfacing with an OpenNebula front-end server. It interacts with OpenNebula using its Ruby API.

        In order to meet these requirements, the application needed to be modular, self-contained and easily customizable. If we used an opinionated framework like Rails, we would be spending the majority of the development time tweaking the framework for the problem domain, rather than focusing on the problem domain itself. On the other hand, building such a fully featured app from scratch within a 3-month timeline was not realistic either.

        Against this background, I started exploring the possibility of using a mini-framework (web DSL), specifically Sinatra. From my mentor, I got to know that they had used Sinatra for certain parts of the OpenNebula project, so it was a safe bet to try in this context.

        Since Sinatra inherently follows the concepts of Rack, Rack's rich middleware stack can be used to bridge the functionality of the apps.

        Collection of Mini-Apps

        Looking at the overall app, it is composed of loosely coupled resource modules which have minimal interaction or dependencies between them. This made it possible to contain each resource module in its own mini app, which means adding, removing or customizing a module can be done without affecting the behavior of the others.

        class HostsApp < Sinatra::Base
        
          #define the model
          require 'models/host'
        
          #define the views (based on mustache)
          register Mustache::Sinatra
          require 'views/hosts/layout'
          set :mustache, {
            :views => 'views/hosts',
            :templates => 'templates/hosts'
          }
        
          set :sessions, true
        
          get '/list', :provides => :json do
            Host.all.to_json
          end
        
          get '/list' do
            @hosts = Host.all
            @flash = flash
        
            unless @hosts.include?(:error)
              mustache :list
            else
              puts "Error: "+@hosts[:error]
              "<h1>Oops..An error occurred.</h1><p>#{@hosts[:error]}</p>"
            end
          end
        end

        Above is a simplified example of how a mini-app is defined. It extends the Sinatra::Base class and follows an explicitly defined MVC pattern. API calls are wrapped in a separate model class, while output generation is done using Mustache based view templates. So it is basically similar to a controller in Rails.

        In the above code block, you may notice there are two routes defined for the GET /list path. The only difference is that one route has a :provides condition, which means it only responds to requests accepting JSON as the content type. This way we can offer different response types for the same resource (i.e. an API) in Sinatra.

        Template Rendering

        As I mentioned earlier, I used Mustache for generating the views of the project. This was the first time I used Mustache, and I was really hooked by its flexibility.

        Mustache differs from traditional language-specific templating schemes by defining its own logic-less template format. This makes it possible to reuse the same template in different contexts. For example, in this project I used the same template for server-side rendering with Sinatra and again on the client-side (with JavaScript) when data is loaded via AJAX.

        Exploring Mustache's capabilities in detail would take a post of its own, so I'll leave that for a future post.

        Routing

        In order to form one high-level application, the individual mini-apps with different end-points needed to be mapped to a single address space.

        For this purpose, I used Rack::Mount, a library written by Josh Peek which also powers Rails3's default routing. It simply routes requests to individual Rack apps based on the path.

        This is what the route set for the Admin Tool looks like (which I hope is self-explanatory):

        # route paths to different apps
        Routes = Rack::Mount::RouteSet.new do |set|
          set.add_route UserSessionApp, { :path_info => %r{^/user_session*} }, {}, :user_session
          set.add_route HostsApp, { :path_info => %r{^/host*} }, {}, :host
          set.add_route VirtualNetworksApp, { :path_info => %r{^/vnet*} }, {}, :vnet
          set.add_route VirtualMachinesApp, { :path_info => %r{^/vm*} }, {}, :vm
          set.add_route UsersApp, { :path_info => %r{^/user*} }, {}, :user
          set.add_route DashboardApp, { :path_info => %r{^/$} }, {}, :dashboard
        
          #public file routes
          set.add_route Rack::File.new(File.dirname(__FILE__) + "/public"), { :path_info => %r{^/public*} }, {}, :public
        end
        
        # run the routeset
        run Routes

        User Authentication

        Another important concern of this project was how to enforce user authentication. Admin console access needed to be restricted by the login credentials defined by One Client of OpenNebula.

        There are several authentication middleware libraries available for Rack. Out of those, Warden seems to be the most flexible and best documented. The ability to easily define custom authentication strategies also made it more suitable for our requirement.

        This is how the authentication strategy based on one_client was defined using Warden:

        Warden::Strategies.add(:password) do
          def valid?
            params["user_name"] || params["password"]
          end
        
          def authenticate!
            u = get_one_client
            (u.one_auth == "#{params["user_name"]}:#{Digest::SHA1.hexdigest(params["password"])}") ? success!(u) : fail!("Could not log in")
          end
        end

        Another interesting thing about Warden is that it is only invoked when we explicitly call it. Otherwise it just remains an object in the Rack environment, without getting in the way of application execution. To invoke Warden, we can call it within a before filter in Sinatra. Request processing will continue or halt depending on the authentication result.

        before do
          #check for authentication
          unless env['warden'].authenticated?
            session["return_to"] = request.path
            redirect "/user_session/new"
          end
        end

        Other essential Rack Middleware

        There are a couple of other Rack middleware that were used in this project, which provide some of the essential conveniences we have in Rails.

        One such middleware is Rack::NestedParams (available in the rack-contrib package), which is used to handle nested form parameters properly. Rack::Flash is also useful; it gives the option of adding flash messages (success, errors and warnings) to the app.

        Source Code

        You can view the full source code of the OpenNebula's Admin Tool from its repository at http://dev.opennebula.org/projects/one-admin-tool/repository

        ]]>
        Looking Back at 2010 http://laktek.com/2010/12/31/looking-back-at-2010 http://laktek.com/2010/12/31/looking-back-at-2010/#comments Thu, 30 Dec 2010 16:00:00 GMT Lakshan Perera http://laktek.com/2010/12/31/looking-back-at-2010 It was a year of transition in my life. Some of the highlights of 2010 were:

        1. Completed my degree and marked the end of formal education.
        2. Saw CurdBee becoming a more established product (We shipped loads of new features in 2010).
        3. Completed another successful Google Summer of Code (probably my last) with OpenNebula project.
        4. Got to learn lots of new stuff (asynchronous processing, real-time communication, etc) by working on Realie project, which was my final year project in University.
        5. Increased traffic to my blog by 300% (but I should have blogged more)
        6. Bought my first car.
        7. Finally bought an iPhone :).

        Lots of interesting things are already planned for 2011. It's surely going to be a more challenging year ahead.

        ]]>
        jQuery isBlank() http://laktek.com/2011/01/07/jquery-isblank http://laktek.com/2011/01/07/jquery-isblank/#comments Thu, 06 Jan 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/01/07/jquery-isblank One of my favorite syntactic sugar methods available in Rails is Object.blank?, which evaluates to true if the given object is false, empty, or a whitespace string. It makes your conditional expressions more readable by avoiding the use of extra boolean operators.

        It would be cool if we could have the same convenience when writing client-side code with JavaScript. Unfortunately, jQuery core doesn't have such a utility function. The closest you get is jQuery.isEmptyObject. It returns true for null, undefined, empty objects and empty arrays; but you can't match whitespace strings with it (which are of course not empty objects).

        So, I wrote this small jQuery plugin to check whether the given object is blank:

        (function($){
          $.isBlank = function(obj){
            return(!obj || $.trim(obj) === "");
          };
        })(jQuery);
        
        $.isBlank(" ") //true
        $.isBlank("") //true
        $.isBlank("\n") //true
        $.isBlank("a") //false
        
        $.isBlank(null) //true
        $.isBlank(undefined) //true
        $.isBlank(false) //true
        $.isBlank([]) //true

        As shown in the above examples, it identifies any value that evaluates to false (null, undefined, false), an empty array, or a whitespace-only string as blank.

        Update: jtarchie commented on this gist suggesting an alternative method, which would even match empty objects.

        ]]>
        Understanding Prototypal Inheritance in JavaScript http://laktek.com/2011/02/02/understanding-prototypical-inheritance-in-javascript http://laktek.com/2011/02/02/understanding-prototypical-inheritance-in-javascript/#comments Tue, 01 Feb 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/02/02/understanding-prototypical-inheritance-in-javascript Behavior reuse is one of the key aspects of Object Oriented programming. Many mainstream Object Oriented languages achieve behavior reuse by using class based inheritance. In class based inheritance, a class defines how the objects stemming from it should behave.

        However, not all languages use class based inheritance to achieve behavior reuse. The best possible example is JavaScript. It doesn't have a concept of classes. Many developers often get confused about JavaScript's object oriented capabilities due to this fact. But in reality, JavaScript is a more expressive and flexible Object Oriented language compared to some of the mainstream languages.

        If JavaScript doesn't have class based inheritance, how does it reuse behavior? It follows a technique called prototypal inheritance.

        In prototypal inheritance, an object is used to define the behavior of another object. Let's try to understand this with a simple example:

            var father = {
             first_name: "James", 
             last_name: "Potter",
             hair_color: "black",
             is_good_at_quidditch: true,
        
             name: function(){
              return this.first_name + " " + this.last_name
             }
            }
        
            var son = {
             first_name: "Harry" 
            }
            son.__proto__ = father;
        
            father.name()
            >> James Potter
        
            son.name()
            >> Harry Potter
        
            son.hair_color
            >> black
        
            son.is_good_at_quidditch
            >> true

        Here the 'father' object acts as the prototype for 'son'. Hence, 'son' inherits all properties defined for 'father' (note that the __proto__ property of the 'son' object was explicitly overridden to set 'father' as the prototype).

        Even though it was used as a prototype, the 'father' object can still be manipulated as a regular object. This is the main difference between a prototype and a class.
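
        Note that __proto__ is a non-standard property; in browsers with ES5 support, the same relationship can be set up with Object.create (a sketch of the equivalent):

            // create 'son' with 'father' as its prototype, then add its own property
            var son = Object.create(father);
            son.first_name = "Harry";

            son.name()
            >> Harry Potter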

        Object Hierarchy

        The process of an object responding to a property call in JavaScript is fairly straightforward. It first checks whether it defines the property on its own; if not, it delegates the property call to its prototype object. This chain continues to the top of the object hierarchy until the property is found.

        Talking about the object hierarchy, all objects in JavaScript descend from the generic Object. The generic Object prototype is the default prototype set on all objects at instantiation, unless a custom prototype object is defined.

        So any given inheritance hierarchy in JavaScript is a chain of objects with the generic Object prototype at the root.
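
        You can see this delegation at work with hasOwnProperty, which only reports properties defined directly on the object itself (continuing the 'son' example from above):

            son.hasOwnProperty("first_name")
            >> true

            son.hasOwnProperty("hair_color")   // found via the prototype chain, not on 'son' itself
            >> false

            "hair_color" in son                // the in operator walks the whole chain
            >> true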

        Creating New Objects

        Though JavaScript doesn't have classes, you can define a constructor function and call it with the new keyword to instantiate a new object. As I mentioned before, when the new object is created it uses the generic Object prototype as its prototype.

        Let's take an example of creating basic shape objects. The constructor takes the number of sides and vertices as the arguments.

        var Shape = function(sides, vertices){
          this.sides = sides; 
          this.vertices = vertices; 
        }
        var triangle = new Shape(3, 3);

        What if we want to create different types of triangles? Yes, we can use our basic shape object as the prototype for all our triangle objects.

        var Triangle = function(angles, side_lengths){
          this.angles = angles || [60, 60, 60]; 
          this.side_lengths = side_lengths || [5, 5, 5]; 
        }
        Triangle.prototype = new Shape(3, 3);
        
        var isosceles_triangle  = new Triangle([70, 70, 40], [5, 5, 10]);
        var scalene_triangle  = new Triangle([70, 60, 50], [5, 10, 13]);
        
        isosceles_triangle.sides
        >> 3
        
        isosceles_triangle.vertices
        >> 3
        
        scalene_triangle.sides
        >> 3
        
        scalene_triangle.vertices
        >> 3

        Basically, when you call a constructor function with the new keyword, it sets the __proto__ property of the newly created object to the object defined in the prototype property of the constructor function.

        Modifying Prototype Object at Runtime

        All objects in JavaScript can be modified at runtime. Since prototype objects are also regular objects, we can modify them too. However, when you modify a prototype object, the changes are reflected in all its descendant objects too.

          Triangle.prototype.area = function(base, height){
            return(1/2 * base * height);
          }
        
          isosceles_triangle.area(10, 4); 
          >> 20

        What's most interesting is that we can use this to extend the built-in objects in JavaScript. For example, you can extend the String object's prototype to add a capitalize method.

            String.prototype.capitalize = function(){
              return this.charAt(0).toUpperCase() + this.slice(1);
            };
        
            "john".capitalize();
            >> John

        Further Reading

        If you'd like to learn more about JavaScript's object model and prototypal inheritance, you will find the following articles/posts useful.

        • Details of the object model (MDC Doc Center)
        • Inheritance revisited (MDC Doc Center)
        • Classical Inheritance in JavaScript (by Douglas Crockford)
        • Prototypal Inheritance in JavaScript (by Douglas Crockford)
        • Simple “Class” Instantiation (by John Resig)
        ]]>
        Introducing jQuery Smart AutoComplete... http://laktek.com/2011/03/03/introducing-jquery-smart-autocomplete http://laktek.com/2011/03/03/introducing-jquery-smart-autocomplete/#comments Wed, 02 Mar 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/03/03/introducing-jquery-smart-autocomplete A few months ago, we did a major revamp of the CurdBee UI. In this revamp we decided to make use of autocomplete on several data-heavy inputs to improve usability. We assumed any existing JavaScript autocomplete plugin could easily be modified to suit our requirements.

          Unfortunately, that wasn't the case. None of those plugins were comprehensive and flexible enough to cover the cases we had at hand. Here are some of the issues we faced with the existing plugins:

          • They introduced a whole bunch of new dependencies.
          • They didn't support custom filtering algorithms.
          • You cannot modify the HTML of the results (also, styling the plugin-generated markup becomes a nightmare).
          • You cannot specify what to do when there are no results.

          So I had to write a custom jQuery based autocomplete plugin to suit the requirements of CurdBee. Later, I used this same plugin on several other hobby projects with slight modifications.

          At this point, I realized this could be something useful for others too. After a couple of weeks of free-time hacking, I'm ready to introduce you to the Smart Autocomplete plugin!

          First of all, I suggest you check out the different use cases of the plugin by visiting the demo page. If those examples pump you up, read on!

          Basic Usage

          Basic Autocomplete Screenshot

          Using Smart Autocomplete in your projects is easy. Its only dependency is the jQuery core library. Make sure you have the latest jQuery core version (1.5 or above); if not, you can download it from here.

          To get the Smart Autocomplete plugin visit its GitHub page.

          Once you have downloaded both jQuery and the Smart Autocomplete plugin, reference them in the header of your page like this:

            <script src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js" type="text/javascript"></script>
            <script src="jquery.smart_autocomplete.js" type="text/javascript"></script>

          Now, define an input field to use for autocomplete.

            <div>
              <label for="fruits_field">Favorite Fruit</label>
              <input type="text" autocomplete="off" id="fruits_field"/>
            </div>

          To enable autocompletion on the text field, you must select it using jQuery and call the smartAutoComplete method on it:

            $(function(){
          
             $('input#fruits_field').smartAutoComplete({
              source: ['Apple', 'Banana', 'Orange', 'Mango']
             });
          
            });

          As you can see in the above example, the only required option is the source. Source defines the set of values that will be filtered when the user starts typing in the field. Source can be either an array or a string specifying a URL. If you specify a URL, the filter function will send an AJAX request to that URL, expecting a JSON array as the response (check Example 2).

          You will also need to add a couple of styles to display the results correctly.

              ul.smart_autocomplete_container li {list-style: none; cursor: pointer; margin: 10px 0; padding: 5px; background-color: #E3EBBC; }
              li.smart_autocomplete_highlight {background-color: #C1CE84;}

          Once you complete the above steps, your text field will have autocomplete capabilities.

          Power of One-liners

          Though Smart Autocomplete comes with sensible defaults, you can easily customize the default behavior by setting a couple of one-line options.

          If you check the Example 2, you will notice the autocomplete field is defined as follows:

              $("#country_field").smartAutoComplete({ 
                source: "countries.json", 
                maxResults: 5,
                delay: 200,
                forceSelect: true
              });

          Apart from the required source field, it has 3 other options defined. The maxResults option sets the maximum number of results to be shown; the delay option sets the number of milliseconds the plugin should wait (after the user presses a key) before calling the filter function.

          By setting the forceSelect option to true, you can block free-form values in the autocomplete field. This means the user has to select a value from the given suggestions, or the field will be reset to blank.

          You must have seen how Google Instant Search completes the rest of the phrase in gray with the best matching result. It's possible to implement similar behavior with Smart Autocomplete too.

          typeAhead screenshot

            $("#type_ahead_autocomplete_field").smartAutoComplete({
              source: ['Apple', 'Banana', 'Orange', 'Mango'],
              typeAhead: true
            });

          All you have to do is set the typeAhead option to true. Isn't that easy?

          You can find all available options of the plugin in the README.

          Define your own Filtering Algorithm

          The Smart Autocomplete plugin gives you the flexibility to override the built-in filter function with a custom function. A custom filter function should return either an array of results or a deferred promise that resolves to an array. If you call jQuery Ajax methods, they return an object containing a promise by default.
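
          For instance, a promise-returning filter might look like this (a rough sketch; the /cities.json endpoint is hypothetical and assumed to respond with a JSON array of matching names):

            $("#city_field").smartAutoComplete({
              source: "/cities.json", // still declared; the custom filter below is used instead
              filter: function(term, source){
                // jQuery AJAX helpers return a promise; it resolves with the JSON array from the endpoint
                return $.getJSON(source, { q: term });
              }
            });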

          In the 5th example, we use Quicksilver-like filtering to filter the names of the Jazz musicians.

          Jazz Musicians Screenshot

          We use the JavaScript port of the Quicksilver string ranking algorithm, which adds a score() method to the String prototype, to filter the items.

          Here's how the smartAutoComplete method is called on the field, with our custom filter function:

            $("#jazz_musicians_field").smartAutoComplete({
                  source: [
                    "Scott Joplin",
                    "Charles Bolden",
                     //snip 
                  ],
          
                  filter: function(term, source){
                      var filtered_and_sorted_list =  
                        $.map(source, function(item){
                          var score = item.toLowerCase().score(term.toLowerCase());
          
                          if(score > 0)
                            return { 'name': item, 'value': score }
                        }).sort(function(a, b){ return b.value - a.value });
          
                      return $.map(filtered_and_sorted_list, function(item){
                        return item.name;
                      });
                    }
          
                  });

          Do more with Events

          Smart Autocomplete was written following an event-driven approach, so it emits events at every major step in its process. You can bind handlers to these events just like other jQuery events. It's also possible to cancel the default behavior of an event by calling the ev.preventDefault() method. This makes extending and customizing Smart Autocomplete easy.

          In example 6, we make use of the evented model of Smart Autocomplete to build a multi-value selector.

          Multi Select Screenshot

          Let's see how the implementation is done:

            $("#textarea_autocomplete_field").smartAutoComplete({source: "countries.json", maxResults: 5, delay: 200 } );
            $("#textarea_autocomplete_field").bind({
          
               keyIn: function(ev){
                 var tag_list = ev.customData.query.split(","); 
                 //pass the modified query to default event
                 ev.customData.query = $.trim(tag_list[tag_list.length - 1]);
               },
          
               itemSelect: function(ev, selected_item){ 
                var options = $(this).smartAutoComplete();
          
                //get the text from selected item
                var selected_value = $(selected_item).text();
                var cur_list = $(this).val().split(","); 
                cur_list[cur_list.length - 1] = selected_value;
                $(this).val(cur_list.join(",") + ","); 
          
                //set item selected property
                options.setItemSelected(true);
          
                //hide results container
                $(this).trigger('lostFocus');
          
                //prevent default event handler from executing
                ev.preventDefault();
             }
          
            });

          As shown in the above code, we have handlers bound to the keyIn and itemSelect events. Let's try to understand the roles of these two handlers.

          The Smart Autocomplete plugin stores context data on an event using a special smartAutocompleteData object. Default actions use this context data during their execution. For example, the default keyIn action gets the query parameter from the event's smartAutocompleteData object.

          To implement multi-value select, we need to set the last phrase entered (the text after the last comma) as the query parameter that is passed to the default keyIn action. The custom handler we've defined for the keyIn event does that by overriding the query field of the event's context data.

          The itemSelect handler we defined appends the selected item to the field. It also calls ev.preventDefault() to stop the default action from running.

          Apart from the two events we've used in the example, there is a set of other events available. You can find the complete list in the README.

          Digging Deeper

          In this post, I only highlighted the most important features of the Smart Autocomplete plugin. You can learn more about the plugin's capabilities by reading the README and the specs (available in the specs/core/ directory of the plugin).

          As always, I encourage you to fork the plugin's code from the GitHub repo and modify it for your requirements. If you feel your changes can improve the plugin, please feel free to send a pull request via GitHub.

          Also, if you run into any issues using the plugin, please report them via GitHub Issues.

          A Month in Vienna http://laktek.com/2011/06/08/a-month-in-vienna http://laktek.com/2011/06/08/a-month-in-vienna/#comments Tue, 07 Jun 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/06/08/a-month-in-vienna I'm writing this post from 30,000ft above the ground, flying back home from Vienna.

          I came to Vienna last month, on an invitation from the GENTICS crew (the company behind the awesome Aloha Editor) to hack on a really interesting project. I will not reveal many details about the project for now, since it's still a little too premature for that (don't worry, you will get to hear a lot about it in the near future).

          In this post, I'd like to share what it was like to live in the world's most livable city for a month.

          From the moment I landed in Vienna, I was impressed by the convenience and reliability of the public transport. Trams (street cars), subway trains and buses normally operate till midnight and they are very frequent. Once you get used to the subway system, moving around the city is a piece of cake. Entrances to underground stations are clearly visible from a distance, and prominent color codes are used for each underground line. Also, you can travel on any public transport service with a single ticket (valid for a day, week or month). Riding public transport is also very comfortable; there's no rush even during peak hours.

          Vienna is a masterpiece of great architecture and town planning. It's amazing how they have managed to preserve the traditional architectural styles throughout the city. Not only the chapels, palaces, museums and theaters, but every building in the city has its own glory.

          There are lush green parks and pathways throughout the city. Walking along these pathways in the evening, or sitting on a bench to read on a weekend, can be among the best luxuries you will experience. Another interesting fact about Vienna is that you can drink the tap water. It is so pure, as it comes fresh from the mountains.

          You can't talk about Vienna without talking about its food. You have to experience the Viennese Schnitzel, Tafelspitz, Melange (Viennese cappuccino) and the great range of wines. The ice creams and chocolates in Vienna are simply overwhelming for someone like me, who has a sweet tooth.

          Most of the people I met in Vienna were extremely smart and broad-minded. I guess they have inherited this from a knowledge-driven culture and an appreciation of the arts. This free culture allows anyone to live respectably, regardless of their gender, religion, ethnicity, profession or sexual orientation. The city is very safe, and I didn't see a single news report on crime during my stay.

          Also, it's interesting to note that most people in Vienna are obsessed with reading. On public transport, almost everyone has a paper, a book or a Kindle in their hands.

          As I see it, Vienna has become an immaculate city not because of its past glory, wealth or technology, but because of the great discipline you find in its people. People in Vienna are disciplined in the way they work, travel and even have fun. When you do your part responsibly, it raises not only your own living conditions, but also those of others.

          I would like to wrap up this post with an interesting quote from my caretaker. He used to say this every time he served breakfast - "Cooked with love and served with charm!". That's how I cherish Vienna.

          Creating asynchronous web services with Goliath http://laktek.com/2011/08/24/creating-asynchronous-web-services-with-goliath http://laktek.com/2011/08/24/creating-asynchronous-web-services-with-goliath/#comments Tue, 23 Aug 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/08/24/creating-asynchronous-web-services-with-goliath Recently, I've been working on improving the performance of the CurdBee API. There were certain heavily used endpoints that were also tightly coupled to external resources. I wanted to extract these endpoints out of the main app to cut the cruft and improve throughput.

          This required turning them into bare-metal services that could utilize asynchronous processing. Weighing the amount of code we could reuse, it was better to stick with a Ruby implementation rather than switching to a specialized evented platform such as Node.js. However, implementing something like this in Ruby is a challenge, because the Rack interface itself is not designed to be asynchronous.

          Luckily, there are a couple of ways to solve this problem. The most convincing solution I found was to use the Goliath framework from PostRank Labs. It implements a fully asynchronous stack, which includes a web server and a Rack-compatible API. Goliath hides the complexity of asynchronous processing from the developer. With Goliath, you can continue to write your code in the traditional top-down flow, avoiding "callback spaghetti".

          Goliath's Magic Secret

          Goliath serves requests using an EventMachine loop. For each request, Goliath creates a Fiber, a lightweight coroutine primitive introduced in Ruby 1.9. A Fiber is paused or resumed by EventMachine callbacks on IO operations.

          Goliath handles all this by itself, without needing developer involvement. This means developers are free to write code following the traditional top-down flow.

          Goliath also implements an API closely related to Rack and ships common Rack middleware modified for asynchronous processing. So from the outset, writing a Goliath app is very similar to writing any other Rack app.

          Writing a Goliath app

          Since Goliath depends on Fibers, you will require Ruby 1.9+ to write and deploy Goliath apps. Once you have set up Ruby 1.9 on your system, you can run gem install goliath to get Goliath.

          If you have used other Rack frameworks like Sinatra before, grasping Goliath's conventions will be very easy. A simple Goliath app is a Ruby file with a class extending Goliath::API. As in Sinatra, the file name should be the lower-cased (underscored) class name (for example, if your class name is MyApi then your file should be saved as my_api.rb).

          Goliath ships with a bunch of common, frequently used middleware, rewritten in an asynchronous manner. If you want to use any common middleware, make sure you use the Goliath equivalent. Also, if you want to write custom middleware for your app, check the Goliath documentation for guidance.

          Your endpoint class extending Goliath::API must implement a method named response, which should return an array consisting of the status, headers and body (similar to a response in Rack).

          A simple Goliath API implementation will look like this:

          require "rubygems"
          require "bundler/setup" #using bundler for dependencies
          
          require 'goliath'
          require 'mysql2'
          
          class MyApi < Goliath::API
            use Goliath::Rack::Params             # parse query & body params
            use Goliath::Rack::Formatters::JSON   # JSON output formatter
            use Goliath::Rack::Render             # auto-negotiate response format
          
            def response(env)
              # check the auth key (auth_key is assumed to be defined elsewhere, e.g. loaded from config)
              if(!params.include?('auth_key') || params['auth_key'] != auth_key)
                [401, {}, "Unauthorized"]
              else
                response = {"total" => 0} # hash-rocket syntax; {"total": 0} is not valid in Ruby 1.9

                [200, {}, response] # since we use the JSON formatter middleware, output will be formatted as JSON
              end
            end
          
          end

          Goliath supports the convention of multiple environments and provides a simple mechanism for environment-specific configurations. You will have to create a config directory in the app path, and inside it create a file with the same name as your Goliath API file, which defines the configurations.

          A very important thing to always keep in mind is that Goliath uses a Fiber to process each request. Unlike threads, Fibers are not preempted by a scheduler, which means a Fiber gets to run as long as it wants. So if you make any blocking IO calls, they will lock the process, defeating the whole purpose of using Goliath. When you are picking IO libraries, make sure they are written in an asynchronous fashion. Most common libraries do support asynchronous processing; for example, the Mysql2 gem supports asynchronous connections via EventMachine and Fibers.

          Here's how you can configure the Mysql2 driver in Goliath to work in a non-blocking manner:

          require 'mysql2/em_fiber'
          
          environment :development do
            config['db'] = EM::Synchrony::ConnectionPool.new(:size => 20) do
              ::Mysql2::EM::Fiber::Client.new(:host => 'localhost',
                                              :username => 'root',
                                              :socket => nil,
                                              :database => 'myapi_db',
                                              :reconnect => true)
            end
          end

          On a database query, Goliath will pause the Fiber allowing other requests to be processed and will resume when it gets the results.

          Deploying with Capistrano

          We use Capistrano to deploy Goliath apps to production. The process is similar to deploying other Rack apps, and we use the railsless-deploy gem to avoid Rails-specific conventions in Capistrano. Here's what our Capfile and deploy recipes look like:

          Capfile:

          require 'rubygems'
          require 'railsless-deploy'
          load    'config/deploy'

          deploy.rb:

          require 'capistrano/ext/multistage'
          
          set :stages, %w(staging production)
          set :default_stage, "production"
          
          # bundler bootstrap
          require 'bundler/capistrano'

          deploy/production.rb:

          
          #############################################################
          #    Application
          #############################################################
          
          set :application, "myapi"
          set :deploy_to, "/home/app_user/apps/myapi"
          
          #############################################################
          #    Settings
          #############################################################
          
          default_run_options[:pty] = true
          ssh_options[:forward_agent] = true
          set :use_sudo, true 
          set :scm_verbose, true
          set :keep_releases, 3 unless exists?(:keep_releases)
          
          #############################################################
          #    Servers
          #############################################################
          
          set :user, "app_user"
          set :domain, "0.0.0.0"
          set :password, "app_pwd"
          
          server domain, :app, :web
          role :db, domain, :primary => true
          
          #############################################################
          #    Git
          #############################################################
          
          set :scm, :git
          set :deploy_via,    :remote_cache
          set :branch, "master"
          set :repository, "ssh://git@repo_server/repos/myapp.git"
          
          namespace :deploy do
          
            desc "Restart the app after symlinking"
            task :after_symlink do
              try_sudo "god restart myapi"
            end
          
          end

          Monitoring Goliath with God

          Goliath is also a standalone app server, so you don't need any other Ruby app server to run Goliath apps. However, we use God to monitor the Goliath process for memory leaks and automatically restart on failure. Here is the God config we are using for Goliath:

          app_path = '/home/app_user/apps/myapi/current'
          app_env = 'prod'
          ruby_path = '/usr/bin/ruby' #change this if you are using RVM
          
          God.watch do |w|
            # script that needs to be run to start, stop and restart
            w.name          = "my_api" 
            w.interval      = 60.seconds
          
            w.start         = "cd #{app_path} && #{ruby_path} my_api.rb -e #{app_env} -p 9201 -d" 
          
            # QUIT gracefully shuts down workers
            w.stop = "kill -QUIT `cat #{app_path}/goliath.pid`"
          
            w.restart = "#{w.stop} && #{w.start}"
          
            w.start_grace   = 20.seconds
            w.pid_file      = "#{app_path}/goliath.pid" 
          
            w.uid = 'app_user'
            w.gid = 'app_user'
          
            w.behavior(:clean_pid_file)
          
            w.start_if do |start|
              start.condition(:process_running) do |c|
                c.interval = 60.seconds
                c.running = false
              end
            end
          
            w.restart_if do |restart|
              restart.condition(:memory_usage) do |c|
                c.above = 300.megabytes
                  c.times = [3, 5]
                end
          
              restart.condition(:cpu_usage) do |c|
                c.above = 50.percent
                c.times = 5
              end
            end
          
            w.lifecycle do |on|
              on.condition(:flapping) do |c|
                c.to_state = [:start, :restart]
                c.times = 5
                c.within = 5.minute
                c.transition = :unmonitored
                c.retry_in = 10.minutes
                c.retry_times = 5
                c.retry_within = 2.hours
              end
            end
          end

          Proxying with Nginx

          Finally, there should be a way to forward HTTP requests to the Goliath process(es). This can be done by setting up a simple Nginx proxy.

          upstream myapi {
            server localhost:9201;
          }

          server {
            listen 80;
            server_name myapi.myapp.com;

            location / {
              proxy_pass http://myapi;
            }
          }

          I hope this information helps you get started with Goliath; for further details, please check the Goliath website.

          Thank You, Steve! http://laktek.com/2011/10/06/thank-you-steve http://laktek.com/2011/10/06/thank-you-steve/#comments Wed, 05 Oct 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/10/06/thank-you-steve It's still early morning here in Sri Lanka; I was just starting the day passively glancing over my Twitter feed. In a stream filled with rants over the iPhone 4S' form factor and predictions on how Siri is changing the world, seeing this tweet just shocked me. It feels like the passing of someone close to my life. Yes, Steve was terminally ill, but who'd have thought it would come to him so soon?

          I admire Steve Jobs not because of Apple, but for the constant inspiration he gave throughout his life. As a kid, seeing what Bill Gates had achieved, I believed you needed to be an extraordinary person gifted with talent and luck to become successful and change the world.

          However, listening to Steve's Stanford graduation speech changed my perspective. I started to believe anyone can change their destiny, if they have true passion and perseverance.

          Steve's life was not a bed of roses. As a child he was adopted; he had to drop out of university, was kicked out of his own company and lived the best years of his life battling pancreatic cancer. Yet he managed to "put a ding in the universe".

          Thank you Steve for all the inspiration! May your soul rest in peace!

          P.S. - Please consider donating to the Pancreatic Cancer Action Network, to fight against the disease that took Steve's life.

          Why and How I Revamped My Blog http://laktek.com/2011/11/17/why-and-how-i-revamped-my-blog http://laktek.com/2011/11/17/why-and-how-i-revamped-my-blog/#comments Wed, 16 Nov 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/11/17/why-and-how-i-revamped-my-blog Update: I've migrated this blog from Jekyll to Punch. Here's the detailed blog post.

          If you are a regular visitor of my blog, you will notice it has undergone a significant revamp (if you are new here, it used to look similar to any other WordPress blog). Actually, I've been working on this revamp for quite some time and must say I'm really satisfied with the final outcome.

          There were several main intentions behind this revamp. The first one was to make the blog pleasurable to read on mobile. I tend to read a lot on mobile, but 99% of the sites I visit render as crap on mobile. Thanks to the Readability bookmarklet and Instapaper I get to read them on mobile, but with limitations. With these services you lose the engagement with the original site. Also, mobile readers tend to drop important bits of information such as code blocks, images and tables in the process of scraping. I wanted to make sure at least my blog can be read on mobile without any additional steps or tools.

          When I originally started this blog 5 years ago, it was powered by WordPress and hosted on Dreamhost. This was pretty much the preferred setup at that time, but over the years it became too troublesome to maintain. WordPress became a constant target of attackers and there were important security releases almost every day. If you miss one, you are busted! Once I nearly got removed from the Google index thanks to such an attack. In WordPress, even a simple template customization means you are diving into a pool of spaghetti soup. Apart from that, shared hosts like Dreamhost have become slow, inconvenient and simply not worth what you pay. This setup was driving me crazy and I wanted to get rid of it.

          Mobile First Design

          Having worked a lot on backend and client-side scripting, this was my first attempt at designing an entire site with HTML5 and CSS3. Actually, thanks to modern browser support, frontend design has become more interesting and fun. I managed to mock this layout entirely in HTML without even touching an image editor. The beauty of HTML mocks is that they are interactive and you always know how the design will render in the browser.

          Since I wanted to optimize for mobile, I started with a mobile-first design. The process was immensely simplified thanks to the 'HTML5 Boilerplate' and '320 and Up' responsive stylesheets. Mobile-first design expects you to identify the most important elements of the UI and then extend them for larger screens.

          I used Voltaire as the typeface for the main title and the hand-drawn Shadows Into Light as the typeface for the subtitle. Both of these fonts are freely available and were embedded using Google Fonts. The header background was spiced up with a CSS3 gradient (generated from here).

          All icons used in this design are in SVG format. The advantage of using SVGs is that they can be scaled according to the viewport. I found this really cool, especially when dealing with multiple resolutions. To get SVGs to work across all browsers, I load them via RaphaelJS. All icons used here are from The Noun Project collection and the Raphaël icon set.
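
          Rendering an icon this way boils down to a couple of lines (a minimal sketch, not the blog's actual code; the element id and path data below are made up):

            // draw a simple icon path into a placeholder element with RaphaelJS
            var paper = Raphael(document.getElementById("home_icon"), 32, 32);
            paper.path("M16 2 L30 16 L24 16 L24 30 L8 30 L8 16 L2 16 Z")
                 .attr({ fill: "#444", stroke: "none" });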

          All posts in this blog follow the semantic guidelines provided by Readability, which are based on HTML5 elements and the hNews microformat.

          The homepage spreads to 3 columns on large screens

          Powered by Jekyll

          I prefer to use Vim for all my writing (from code to emails). I love how it lets me keep my focus without dropping out of the flow. Also, I feel more comfortable using Markdown for formatting than a WYSIWYG editor. Earlier, I used to draft my posts this way and paste them into WordPress; but it would have been awesome if I could publish them directly from the command line. Jekyll, a static site generator by Tom Preston-Werner, does exactly that.

          Since Jekyll is based on Ruby and uses the Liquid templating system, customizing and extending it is very easy. However, I didn't need much customization to convert my existing WordPress blog to Jekyll.

          Hosted in Amazon S3

          As I mentioned before, I was fed up with dealing with shared hosts, yet I couldn't make up my mind to go for a VPS just to host a blog. With Jekyll, there's no server-side logic involved. The site is generated locally, and what I needed was a place to host the HTML files and other static resources. Amazon S3 fits this purpose beautifully. Converting any S3 bucket to serve a website is as easy as ticking a box in the AWS console (you can read more details on the AWS blog).

          I had to map the domain laktek.com to the S3 endpoint with a CNAME record. One downside of this is that you lose the ability to maintain email addresses on the domain (a better way would have been to use a subdomain for the blog, but all my link juice was on this domain).

          Currently, I use the Jekyll-S3 gem to push the generated site to the S3 bucket. It still doesn't have a mechanism to push only the updated files, but I'm hoping to have a better syncing strategy in the future.

          Handling Dynamic Pages

          In WordPress there are dynamic index pages by date (/2011/10) or tag (/tag/code). To have this behavior in Jekyll, it seemed I needed to generate static pages for each date and tag. I didn't like this idea, as it would make the build and deploy process even slower with all the additional pages. Instead, I came up with a small hack to handle the dynamic pages.

          In Amazon S3, you can specify an error page to display when it cannot find the requested resource. I'm using the archive page as the error endpoint, which renders a list of all the posts. Then, with the help of a simple JavaScript snippet, I filter the page to match the requested URL.

          To get a better idea of what's happening here, try visiting the following pages - http://laktek.com/2009 or http://laktek.com/tag/code.
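
          The filtering script itself can be as simple as the following sketch (an illustration of the idea, assuming jQuery and a ul.archive post list - not the exact script used on this blog):

            $(function(){
              var requested_path = window.location.pathname;

              // hide every archive entry whose link doesn't match the requested path
              $("ul.archive li").each(function(){
                var post_url = $(this).find("a").attr("href") || "";
                if (post_url.indexOf(requested_path) === -1) {
                  $(this).hide();
                }
              });
            });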

          Managing it from the Cloud

          Most of the time, I work on and publish my blog posts from my local machine. But that doesn't mean the local machine is the only place I can build my blog from. I have a setup that enables me to build it from any machine.

          I keep the templates and the generator in a git repository and push the changes to GitHub (yes, you can reuse the source, but don't just rip the site entirely). All the posts and sensitive configuration details are stored in Dropbox. If I can access these resources, I can build my blog from any machine. In the future, I'm planning to offload the entire build process to the cloud using Amazon EC2 Spot Instances.

          I should also mention Nocs, a free iPhone app which I use to edit the posts in Dropbox with Markdown syntax. It's a really convenient way to make quick edits to posts and jot down ideas for future posts.

          Basic Patterns for Everyday Programming http://laktek.com/2011/11/23/basic-patterns-for-everyday-programming http://laktek.com/2011/11/23/basic-patterns-for-everyday-programming/#comments Tue, 22 Nov 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/11/23/basic-patterns-for-everyday-programming For most of you, the patterns mentioned below should be nothing new. They are very basic things we slap into our code every day, and at times they feel more like code smells than smart patterns. However, I've been doing some code reviewing lately and came across a lot of code that lacks even these basic traits. So I thought of writing them down to help novice developers who want to get a better grasp of them.

          These patterns are commonly applicable in most general purpose programming languages, with slight syntactical changes. I use Ruby and JavaScript for the examples in this post.

          Verify object's availability before calling its methods or properties

          In an ideal world, we expect every call for an object to return that object, but in the real world either the object or null is returned. If we try to invoke a method without considering that we could have a null object, exceptions will be raised or, worse, unexpected results returned.

          The simple way around this is to verify the object's availability before calling its methods or properties. We connect the object and its method (or property) call with a logical AND operator (&&), so the method is only invoked if the object is truthy (not null). This technique is commonly known as 'andand'.

          Here's a real-life example in JavaScript, where we use the native JSON object to parse a string:

            var parsed_content = window.JSON && window.JSON.parse("{}");

          If the native JSON object is not present in the window context, parsed_content will simply be set to undefined instead of an exception being raised.

          Some languages have built-in shorthand methods for this pattern. If you are using the Ruby on Rails framework (2.3+), Object.try serves the same purpose. Which means:

            @person && @person.name

          can be written as:

            @person.try(:name)

          Set a default value with assignments

          When we assign a value to a variable, the value returned could actually be null or undefined. In such instances it's better to assign a default value. This minimizes surprises later in the code and simplifies the conditional logic involving that variable.

          To assign a default value in a single assignment statement, we can use the logical OR operator (||), which assigns the latter value if the former is falsy.

          Here's a simple example in Ruby:

            @role = @person.role || "guest"
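
          The same pattern works in JavaScript, since || returns its second operand when the first is falsy (the names below simply mirror the Ruby example above):

            var role = person.role || "guest";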

          Gotcha: Be aware of contexts where your variable could legitimately hold a boolean value. The default will be returned even when the expected value is legitimately false.

          Checking whether a variable equals to any of the given values

          Imagine an instance where you have to perform an action if the current_day is either Monday, Wednesday or Friday. How would you check that condition?

          I've seen many write this as:

            if(current_day == "Monday" || current_day == "Wednesday" || current_day == "Friday") 
              # perform action
            end

          It does the job, but as you can see, it's verbose. What happens if we have to mix another condition with this in the future (e.g. also check whether the calendar date is above 20)?

          A better way to do this is to collect the given values into an array and check against it. Here's the modified code:

            if(["Monday", "Wednesday", "Friday"].include?(current_day)) 
              # perform action
            end

          The same example can be written in JavaScript like this:

            if(["Monday", "Wednesday", "Friday"].indexOf(current_day) >= 0){
              // perform action
            }

          Extract complex or repeated logic into functions

          When you have long, complex logic in condition or assignment statements, extract it into functions. This improves code clarity and also makes refactoring a lot easier.

          Here's a slightly modified version of previous example:

            if(["Monday", "Wednesday", "Friday"].include?(current_day) && (current_date > 20)) 
              # perform action
            end

          We can extract this logic into a function and call it like this:

            def discount_day?
              ["Monday", "Wednesday", "Friday"].include?(current_day) && (current_date > 20)
            end
          
            ...
          
            if(discount_day?) 
              # perform action
            end

          This refactoring allows others to read the code in the context of the domain, without having to comprehend the internal logic.

          Doing a similar refactoring should be possible in every language.
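
          For instance, the same extraction in JavaScript could look like this (a sketch assuming current_day and current_date are available in scope):

            function discount_day(){
              return ["Monday", "Wednesday", "Friday"].indexOf(current_day) >= 0 && (current_date > 20);
            }

            ...

            if(discount_day()){
              // perform action
            }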

          Memoize the results of repeated function calls

          Another advantage of extracting logic into functions is that you can memoize the result of the calculation, if it's going to be needed repeatedly.

          Here's how simple memoization works in Ruby.

            def discount_day?
              @discount_day ||= (["Monday", "Wednesday", "Friday"].include?(current_day) && (current_date > 20))
            end

          Let's try to understand what happens here. On the first call, the @discount_day instance variable is undefined; hence the assignment block is evaluated and its result is assigned to @discount_day. On the next call, since @discount_day already holds a value, that value is returned without evaluating the assignment block.

          Let's see how to do something similar in JavaScript:

            var is_discount_day; // cache for the computed result (a separate name, so it doesn't overwrite the function)

            function discount_day(){
              if(typeof is_discount_day === "undefined"){
                is_discount_day = (["Monday", "Wednesday", "Friday"].indexOf(current_day) >= 0 && (current_date > 20));
              }
              return is_discount_day;
            }

          Each language may have its own way of doing memoization; refer to your language's idioms and take advantage of them.

          After Graduation http://laktek.com/2011/12/15/after-graduation http://laktek.com/2011/12/15/after-graduation/#comments Wed, 14 Dec 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/12/15/after-graduation One year after graduation, I hear many of my friends already reminiscing about our days as undergrads and the freedom we had then to live our lives the way we wanted. Yes, it was the time we never feared to pit ourselves against extreme challenges, found love in hopeless places and fought to change the world!

          But do we really lose this freedom to live our lives the way we want when we graduate?

          As I see it, the following 3 decisions you take after graduation will define everything about you and your future.

          1. What do you do for a living?
          2. Where do you live?
          3. Whom do you live with?

          Not everyone will have the same options and conditions when making these decisions, so you cannot simply compare how smart or correct one person's decisions are against another's.

          The most important thing is to be conscious of yourself when making these decisions. Never just go with the flow or let others make these decisions for you. Then, at some point in life when you experience the consequences (both bitter and sweet), you know they are the results of your own decision making.

          Keep control of your life and live the way you want!

          Sugarless - A Functional & Context Oriented way to write JavaScript http://laktek.com/2011/12/21/introducing-sugarless http://laktek.com/2011/12/21/introducing-sugarless/#comments Tue, 20 Dec 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/12/21/introducing-sugarless Fundamentally, JavaScript is a functional programming language. It's built on the concepts of higher-order functions, closures and lazy evaluation. It also has objects with prototypal inheritance. Unlike in Object Oriented languages, JavaScript's objects don't have methods. Instead, JavaScript executes functions in the context of objects. When you reference this inside a function, it returns a reference to the function's current context.

          Try running the following expressions in your browser's console.

            var testFunc = function(){ return this; }
            testFunc()
          
            //returns reference to window object
          
            var obj = new Object();
            obj.testFunc = testFunc;
            obj.testFunc()
          
            // returns reference to obj

          As you can see, the same function referenced two different this values on the two occasions. Moreover, we can explicitly set the this value (or the context) when we invoke a function.

            var testFunc = function(){ return this; }
            var obj = new Object();
          
            testFunc.call(obj);
          
            // returns reference to obj

          I think most of us find JavaScript confusing because we tend to think from an Object Oriented mindset. It's more pragmatic to think of JavaScript in terms of functions and contexts. However, the lack of expressivity in the language itself limits this line of thought.

          What is Sugarless?

          Sugarless is a more expressive way to write functional and context-oriented programs in JavaScript.

          Imagine a case where you have an input which has to undergo certain operations to produce a meaningful output. We define each operation as a function. The common way to do this in JavaScript would be:

            var output = truncate(trim(sanitize(input)), 200)

          At a glance, it's not very easy to comprehend. It gets further complicated if we try to add more functions or remove a certain function.

          If we think in terms of Object Oriented concepts, we can try to make this more readable with a fluent interface (i.e. method chaining).

            var output = input.sanitize()
                              .trim()
                              .truncate(200);

          In order to do this in JavaScript, we must make sure the functions are available in the input object's prototype chain, that the object itself is mutable, and that every function apart from the last one returns the input object. Though this can be done, it doesn't feel very natural or flexible.

          Using Sugarless, this is how we can write it in a more readable and flexible manner:

            var output = Sugarless(input)(
               sanitize          
             , trim             
             , truncate, "", 200   
            );

          Sugarless will invoke each function with the this value set to input (the context) and the first argument set to the return value of the previous function. The only thing we need to change in the functions is to use the this value as the input when no argument is passed (you can find the full example here).
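
          Such a function could fall back to its context like this (a minimal sketch, not the code from the linked example):

            function trim(text){
              var input = (typeof text === "undefined") ? this : text; // use the context when no argument is passed
              return String(input).replace(/^\s+|\s+$/g, "");
            }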

          What happens under the hood?

          It's not hard to understand the logic behind Sugarless. When you call Sugarless with an object, you are creating a new context. Then you can define functions (and arguments) to run under this context.

            Sugarless(context_object)(
              function(){ }, arg1,.., argN 
            , function(){ }
            ...
            ...
            , function(){}
            );

          In pure JavaScript terms, Sugarless(obj) is a function which returns a function. The returned function accepts any number of arguments (the first argument should always be a function). Sugarless will invoke the functions passed, with the this value set to the context object (obj). Also, if you define any non-function values after a function, they are passed as arguments to that function. However, if the previous function in the chain returns a value, it will override the first argument of the current function.
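
          To make those mechanics concrete, here's a drastically simplified sketch of the idea (an illustration only, not the actual Sugarless implementation):

            function miniSugarless(context){
              return function(){
                var args = Array.prototype.slice.call(arguments);
                var result; // return value of the previously invoked function

                for(var i = 0; i < args.length; i++){
                  if(typeof args[i] !== "function") continue; // plain values are consumed below

                  // collect the non-function values that follow this function
                  var fnArgs = [];
                  for(var j = i + 1; j < args.length && typeof args[j] !== "function"; j++){
                    fnArgs.push(args[j]);
                  }

                  // a previous return value overrides the first argument
                  if(result !== undefined) fnArgs[0] = result;

                  result = args[i].apply(context, fnArgs); // invoke with `this` set to the context
                }

                return result;
              };
            }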

          Sugarless' Powers

          Along with context and function chaining, Sugarless gives you a bunch of nifty features to organize your code better. I'll highlight some of the most interesting ones here. To learn about all the available features, please refer to the README.

          Sugarless provides a mechanism for deferred execution. By default, all functions in the chain will execute sequentially. However, if you call Sugarless(this).next() inside one of the functions, it will halt the chain and return the next function to be executed. If you are making an asynchronous call, you can pass the next function as a callback and resume the chain when you receive the results.

              Sugarless(obj)(
                 function() { setTimeout(Sugarless(this).next(), 60) }
               , function() { console.log("second function") }
               , function() { console.log("third function") }
              );

          You can provide optional before and after callbacks to a context, which will be invoked before and after the chain respectively. This can be really useful if you want to create wrappers around Sugarless.

              Sugarless('{"name": "John"}', {
                  before: function(){ return JSON.parse(this); }
                , after:  function(obj){ return JSON.stringify(obj); }
              })(
                   function(obj) { obj.profession = "Programmer"; return obj; }
                 , function(obj) { obj.favorite_food = "Pizza"; return obj; }
              );

          Sugarless makes sure not to invoke any functions when the context is null or undefined. Instead, it simply returns null as the result. This behavior is somewhat similar to the Null Object pattern you find in Object Oriented programming. Further, you have the option to specify a fallback context in the event of a null context.

          Here's an example of providing a polyfill for navigator.geolocation for browsers that don't implement it (you can see the full example here).

            var customGeolocation = {};
          
            Sugarless( navigator.geolocation, { 
                fallback: customGeolocation
            })(
              function(){}
            , function(){}
            );

          Bottom-up Programming

          Using Sugarless will encourage you to build your solution by separating behavior into focused and untangled functions. For example, here's how we have defined the initial contexts in the Todo List example:

                $("ul#pending_todos")(
                  Store.fetch
                , List.add
                );
          
                $("form#new_entry")(
                  onSubmit, [ Form.captureInput, Task.save, Form.reset ] 
                );

          We instruct it to fetch tasks from the datastore and populate the pending todos list. For the new entry form, we have defined certain behaviors to invoke when a user hits submit. Note how each function focuses on doing one small task and how the context holds them together to perform the bigger task. This approach makes it easier to understand the behaviors and even allows you to refactor a particular behavior without affecting others.

          Since these functions take the context into consideration, reusability also becomes very easy. For example, later we also wanted to show a list of completed todos apart from the pending todos. It behaves very similarly to the pending todos list, other than having a strikethrough in its text.

              $("#completed_todos")(
                  Store.fetch
                , List.add
              );

          We reused the same Store.fetch and List.add functions to populate the completed todos list as well. From the context, the functions know which todos to fetch and which list to add them to.

          For brevity, I'm not going to explain the full implementation here. I suggest you check the source, which is pretty much self-explanatory.

          Give it a try...

          This is just the initial public release of Sugarless. There can be bugs, ambiguous parts and a ton of possible improvements. So I appreciate all your feedback and contributions to make it better. The best way to start is to write some code with it.

          You can install it from NPM by running npm install sugarless or you can grab the source from GitHub. If you want to learn more, check the examples and also read the spec.

          Personally, Sugarless gave me a better grip on JavaScript and I hope it will feel the same for you too. However, I don't expect it to be everyone's cup of tea either. The choice of tools largely depends on personal taste and experience :)

          Last but not least, I should thank Nuwan Sameera for being my sidekick in this project, and the rest of the Vesess team for their invaluable feedback and support.

          Reviewing My 2011 http://laktek.com/2011/12/30/reviewing-my-2011 http://laktek.com/2011/12/30/reviewing-my-2011/#comments Thu, 29 Dec 2011 16:00:00 GMT Lakshan Perera http://laktek.com/2011/12/30/reviewing-my-2011 It's almost the end of 2011. Now is a good time to review how things went over the last 12 months and make some resolutions for the next 12.

          From the start, I expected this to be a challenging year. This year, I turned 25 and was working full-time (no more formal education). So it was like moving on to a new chapter in life. From the beginning, I had a few personal goals for the year. Here's a summary of how I progressed with them during the year.

          Blogging

          Blog Traffic Graph

          I've been running this blog since 2006, but it was only this year that it actually broke out of its rut. Traffic-wise it's still nothing spectacular, but I like how it has evolved with my experience.

          This year, I didn't post frequently, but when I did, I tried to come up with something interesting to read. I enjoy blogging because it is a great way to learn. From gathering points for a post, to presenting it to suit the audience, to the discussions that take place afterwards, it all helps expand my knowledge of the topic.

          Also, I redesigned the blog and switched to a new platform during this year.

          Fitness

          Fitness Graph

          Every time I started doing workouts in the past, it didn't last for more than a couple of days. Either due to exhaustion or lack of motivation, I simply didn't continue. But last June I decided to give it another try; this time with a much simpler and more relaxed schedule (based on the couch to 5k plan). So far I've been able to stick to the schedule and ramp things up gradually (the dip in October was due to bad weather). I should mention RunKeeper, which provided great assistance by tracking my progress and motivating me to stick to the schedule.

          I was never an athletic person; that's why I'm so delighted with this progress!

          Spending

          Expense Graph

          I believe the best way to live a happy life is not earning more, but spending less. Spending less doesn't mean living a meager life. It's about knowing when and on what to spend. When we launched Expense Tracking for CurdBee last July, I started using it to track and analyze my own expenses.

          Soon I was able to apply some tweaks to my life which not only helped me reduce expenses but also have some long-term benefits. I made a major saving by relying more on public transport and walking instead of spending on fuel (walking is a more efficient and sane option in Colombo, where roads are always congested). Also, I tried sticking to 3 home-cooked meals, instead of eating out or having junk food as fillers. When it comes to buying stuff, I avoided making impulsive decisions and focused more on quality than price. Paying more for quality ensured fewer troubles and longer-term use.

          What are my goals for 2012? I shall continue building on the starts I made in 2011. I want to keep raising my bar.

          Learning Go http://laktek.com/2012/01/05/learning-go http://laktek.com/2012/01/05/learning-go/#comments Wed, 04 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/05/learning-go This year I'm going to try a new programming language - Go. I had this notion that compiled, type-based languages are overly complex and reduce developer efficiency. However, after doing some reading about Go, it appeared to take a different path from the rest and felt like something worth trying.

          Getting acquainted with a programming language is a journey. The first few steps you take with it will define your perception of it. Those first few steps went well for me with Go, and it felt like a good fit for my repertoire. I thought of sharing my learning experience, hoping it will help others who want to learn Go.

          Getting Started

          Installing Go involves a few steps, but it's well documented. If you follow the steps correctly, you should have the Go compiler installed without any issues.

          As recommended on the official site, I started learning the basics with the in-browser tour of Go. It covers the essentials of Go with good examples. Once you are done with the tour, I recommend reading the Go Tutorial and Effective Go, which have good coverage of how to write idiomatic Go code.

          Make sure you configure your text editor to support Go syntax highlighting before moving on to coding. Here's a comprehensive list with details on how to configure various editors for Go. Apart from that, Go comes with a command-line utility called gofmt, which standardizes the format of your Go files by doing things such as removing unnecessary brackets and applying proper indentation and spacing. As recommended in this thread, I added the following lines to my .vimrc to invoke gofmt every time a file is written in Vim, so I always have well-formatted Go code.

            set rtp+=$GOROOT/misc/vim
            autocmd BufWritePost *.go :silent Fmt

          Reference

          Being a new language (and terribly named), Go isn't easy to search for on Google when you're coding. I recommend keeping the language spec at your disposal and always referring to it when in doubt. I hope more resources will become available as the language gains popularity.

          One of the best features of Go is its rich standard library (or default packages). You will find default packages for most common tasks, such as handling HTTP requests, image manipulation, cryptography and encoding/decoding JSON. Most of these packages contain good documentation, and the package documentation also links to the source files of the package, making it easier to read the code.

          You will find a lot of third-party packages written in Go on the package dashboard. I found reading third-party packages' source code to be a good way to discover the best practices and styles involved with Go. Let me point you to two such packages worth reading - one is Google's Snappy compression format implemented in Go, and the other is a wrapper for GitHub's issues API written by Justin Nilly.

          Organizing Code

          In Go, you have to organize your code using packages. Every file in a Go program should belong to a package. During compilation, the compiler will combine all files with the same package clause. There's no requirement to keep the files of the same package in the same directory; you have the flexibility to physically organize your files the way you prefer. However, a file can have only one package clause.

          Inside a package you can declare variables, constants, types and functions. All these top-level declarations will be scoped to the package. If you want to expose an identifier to the outside, you must start its name with a capital letter. In Go this is known as exporting, and there's no concept of access modifiers, which is one less thing to worry about when coding Go.

          To use a package elsewhere you must import it. Imports are scoped to the file, so if you use a package in different files, each of them must import it individually, even if they all belong to the same package.

          Here's an example of a typical Go file. Note the package clause, the import and the exported function identifier.

            package foo
          
            import "bar"
          
            var my_var = "baz"
          
            func Baz() string {
              return my_var 
            }

          Identifiers & Declarations

          Identifiers can only contain Unicode letters, digits and the underscore (_). This means an identifier such as awesome? is invalid, but you can have identifiers such as µ (using Unicode letters).

          In Go, you can declare multiple identifiers using expressions. Since functions can also return multiple values, we can have expressive statements like this:

            var x, y = getCoords()

          Imagine you are only concerned with the y value in the above context and don't actually need x. You can assign values you are not interested in to the blank identifier (_). So the above example can be modified to:

            var _, y = getCoords()

          You can also group declarations with brackets.

            var (
                  name string 
                  age = 20
            )

          Apart from using var to declare variables, there's also a short-hand form using :=. Here you can omit the type and let it be inferred from the expression. The short-hand form can only be used inside functions (and also in some cases as initializers for if, for and switch statements). You can also declare multiple variables with the short-hand expression. Another interesting thing here is that you are allowed to redeclare variables in multi-variable short-hand declarations, provided there's at least one new variable in the declaration. Here's an example:

            func name() string {
              first_name := "John" 
              first_name, last_name := "Peter", "Pan"
          
              return first_name + " " + last_name 
          
            }

          Functions & Methods

          As I mentioned earlier, functions in Go can have multiple return values. In idiomatic Go, the main purpose of multiple returns is error handling. You define one return value as the result and another as the error. The caller of the function should check the value of the error and handle it if it's present. Let me explain this with an example:

            func getFile() (file *File, error os.Error) {
              ...
            } 
          
            func process() string {
              file, error := getFile()
          
              if error != nil {
                print("Error occurred in retrieving the file.")
                return ""
              }
          
              // do something with the file
          
              return result
            }

          Every function in Go must end with a return statement or a panic statement, or have no return values. You cannot simply return inside a control statement such as if or switch (Update: as Jeff Wendling mentioned in the comments, you can do this if you also have a return statement at the end). Some may call this a language feature that helps make the code more obvious, but the Go core team accepts it as a known issue: http://code.google.com/p/go/issues/detail?id=65

          Go also has methods. But unlike in other languages where methods are bound to objects, in Go methods are bound to a base type (actually, Go doesn't have a concept of objects). Basically, a method is the same as a function, with an explicit receiver declared before the function name. Note that you can define methods only for types declared in the same package.

          Here's an example of how methods can be defined and called.

            type mystring string
          
            func (s mystring) capitalize() string {
              ...
            }
          
            func main() {
              var str mystring = "paul"
              print(str.capitalize())
            }

          String & Character Literals

          Similar to C, Go has both character literals and string literals. Character literals should be written inside single quotes ('a' or '\u12e4').

          String literals come in two forms. One is known as the raw form, where you write the string inside back quotes (`abc`), and the other is the interpreted form, where the string is written inside double quotes ("abc"). Strings in raw form can span multiple lines, while interpreted strings must be on a single line. In raw form, if you write \n it will be output as-is, whereas in interpreted form it is treated as an escape sequence (and produces a new line).
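
          Here's a small sketch of the difference (the variable names are arbitrary):

            func main() {
              raw := `line one
            line two, where \n stays as two characters`
              interpreted := "line one\nline two"
          
              print(raw, "\n", interpreted)
            }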

          Don't confuse the interpreted form with string interpolation found in other languages. The closest thing to string interpolation in Go is the fmt.Sprintf function.

            fmt.Sprintf("The value is %v", 15) //output: The value is 15

          Control Flow

          There are two ways to do conditional branching in Go - using if and switch statements. You can provide an initializer statement before the conditional expression. These expressions need not be wrapped in parentheses.

          Branches in if statements should always be written inside blocks (enclosed by {}). Go doesn't support the ternary operator (?:) or single-line if statements as in other languages. When writing an else statement, it should always be on the same line as the closing curly bracket of the previous if branch.

            if a := getScore(); a > 500 && game_over == true {
              print("You have a high score!")
            } else {
              print("Your score is low!")
            }

          Cases in switch statements have an implicit break in Go. There's no default cascading through case statements (Update: As Jeff Wendling mentioned in the comments, you can use the fallthrough keyword to cascade through cases). If you want multiple cases to provide the same behavior, you may define them in a comma-separated list. Also, in switch statements you can omit the expression; in such instances it is evaluated against true.

            switch color {
              case "red": print("danger")
              case "green", "yellow", "blue": print("normal")
            }
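
          As a quick sketch of the fallthrough keyword mentioned in the update above (the values here are just examples):

            func main() {
              level := "critical"
          
              switch level {
              case "critical":
                print("page the on-call engineer\n")
                fallthrough // explicitly cascade into the next case
              case "warning":
                print("log the incident\n")
              }
            }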

          Compiling

          Go's compiler follows a no-bullshit approach and is quite adamant about the structure of your code. There's no such thing as a warning in the Go compiler. If it finds an issue, it simply won't compile. Importing packages and not using them, or declaring variables and not using them, will stop the compiler from building your code. At the beginning, you will feel such nitpicking is a hindrance, but as you get used to it you will find you are writing cleaner code.
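
          For example, this small sketch fails to compile for both reasons (the comments paraphrase what the compiler complains about):

            package main
          
            import "fmt" // error: imported and not used
          
            func main() {
              unused := 42 // error: declared but not used
            }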

          In this post, I touched only the surface of Go programming. In future posts, I'm planning to dig deeper covering topics such as slices, interfaces, channels, goroutines and testing.

          ]]>
          Learning Go - Constants & Iota http://laktek.com/2012/01/12/learning-go-constants-iota http://laktek.com/2012/01/12/learning-go-constants-iota/#comments Wed, 11 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/12/learning-go-constants-iota When you learn Golang, your first encounter with constant declarations can be a little confusing.

          type ByteSize float64
          const (
              _ = iota  // ignore first value by assigning to blank identifier
              KB ByteSize = 1<<(10*iota)
              MB
              GB
              TB
              PB
              EB
              ZB
              YB
          )

          Why does only the first constant (KB) have a value assigned? What does iota mean? These are some of the questions that could pop into your mind when you go through the above code sample for the first time. As I mentioned in the previous post, the best way to find answers to such questions is to refer to the Go spec.

          Constant Expressions

          In Go, constants can be number, string or boolean values. Basically, a constant value can be represented by a literal (eg. "foo", 3.0, true) or by an expression (eg. 24 * 60 * 60). Since constants are evaluated at compile time, constant expressions should be composed only of constant operands.
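
          As a small sketch of that rule (the names are arbitrary):

          const seconds_per_day = 24 * 60 * 60 // fine: every operand is a constant
          
          var hours = 24
          // const invalid = hours * 60 // compile error: hours is a variable, not a constant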

          If a constant value is a literal or is evaluated from an expression containing only untyped constant operands, it is an untyped constant. If you want, you can explicitly specify a type for a constant at the time of declaration.

          const typed_size int64 = 1024
          const untyped_size = 1024
          
          var f1 float64 = typed_size   // will give you a compile error
          var f2 float64 = untyped_size // valid assignment

          As shown in the above example, if you try to assign a typed constant to a variable of a different type, it will give a compilation error. On the other hand, an untyped constant is implicitly converted to the type of the variable at the assignment.

          Declaring Constants

          You can declare multiple constants on a single line as a comma-separated list. However, you should always have the same number of identifiers and expressions. This means,

           const a, b, c = 1 //invalid assignment

          is invalid.

          You should write it in this way:

           const a, b, c = 1, 1, 1 //valid assignment

          If you feel this is too repetitive, you can try the parenthesized form of constant declaration. The most interesting property of the parenthesized form is that you can omit the expression for an identifier. If you do so, the previous expression will be repeated. So the above example can be re-written in parenthesized form like this:

           const(a = 1
                 b
                 c
                )

          Here constants b and c will also take the value 1. Here's a slightly more involved example:

          const (
              Yes = 1
              True
              No = Yes >> 1
              False
          )

          In the above example, constants Yes and True get the value 1, while No and False get 0. Also, note that we can use a previously defined constant in an expression to declare another constant.

          What is Iota?

          A practical use of constants is to represent enumerated values. For example, we can define the days of the week like this:

          const (
            Sunday = 0 
            Monday = 1
            Tuesday = 2
            Wednesday = 3
            Thursday = 4
            Friday = 5
            Saturday = 6
          )

          Iota is a handy shorthand available in Go that makes defining enumerated constants easy. Using iota as the expression, the above example can be re-written in Go as follows:

          const (
            Sunday = iota
            Monday
            Tuesday
            Wednesday
            Thursday
            Friday
            Saturday
          )

          The iota value is reset to 0 at every constant declaration (a statement starting with const) and within a constant declaration it is incremented after each line (ConstSpec). If you use iota in different expressions on the same line, they will all get the same iota value.

          const(
           No, False, Off = iota, iota, iota // assigns 0
           Yes, True, On                    // assigns 1
          )

          Finally, here's a little tricky one. What would be the value of c?

          const (
              a = iota // a == 0
              b = 5    // b == 5
              c = iota // c == ?
          )

          In the above example, c will take the value 2. That's because the value of iota is incremented at each line within a declaration. Even though the 2nd line doesn't use the iota value, it still gets incremented.

          ]]>
          Different flavors of JavaScript http://laktek.com/2012/01/19/different-flavors-of-javascript http://laktek.com/2012/01/19/different-flavors-of-javascript/#comments Wed, 18 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/19/different-flavors-of-javascript If you are programming with JavaScript, knowing about ES3, ES5 & Harmony specifications and their usage will be useful. Here's a plain & simple explanation of them for your easy understanding.

          ECMAScript

          If we look into the history of JavaScript, it originated from a side project of Brendan Eich called "Mocha". In 1995, it was shipped with the Netscape browser as "LiveScript" and was soon renamed "JavaScript" (mainly due to the influence of Sun Microsystems). Due to the quick popularity of JavaScript, Microsoft also decided to ship it with Internet Explorer in 1996. Microsoft's implementation had slight differences from the original, thus they aptly named it "JScript".

          As the browser wars between Netscape and Microsoft fired up, Netscape pushed JavaScript to Ecma International for standardization. Ecma accepted the proposal and began the standardization under the ECMA-262 standard. As a compromise for all organizations involved in the standardization process, ECMA-262 baptized this scripting language as "ECMAScript".

          Even though we still call it JavaScript, the technically correct name is ECMAScript.

          ES3

          Over the years, Ecma has released different editions of the ECMAScript standard. For ease of use, we refer to these editions as "ESx", where x is the edition number. So the 3rd edition of ECMAScript is known as ES3. ES3 can be considered the most widely adopted edition of ECMAScript.

          The most outdated browser in the mainstream (aka the Disease), Internet Explorer 6, is compliant with ES3. Sadly, the other common IE versions (7 & 8) are also only compatible with ES3. Early versions of most other browsers also supported ES3. This means all the JavaScript features you commonly use are part of ES3. Most JavaScript libraries, frameworks, tutorials, best practices and books written in the past are based on the features standardized in ES3.

          Source-to-source compilers such as CoffeeScript, which aim to run everywhere, compile their output to be compatible with ES3.

          If you are interested in reading the full ECMAScript 3rd edition specification, you can download it from here.

          ES5

          After years of splits and conflicts of interest, Ecma's Technical Committee came to an agreement in 2008 to follow two paths for the future development of ECMAScript. One was an immediate plan to overcome the issues in the ES3 specification (then called ES3.1). The other carried a long-term vision to evolve the language for modern requirements. To support these plans, they also decided to kill the ES4 specification, which was under development at the time.

          The ES3.1 edition was finally released as ES5 in 2009. Some of the notable features in this edition were native JSON support, better object creation and control options, utility Array functions and the introduction of strict mode to discourage the usage of unsafe and poorly implemented language features. You can read a detailed introduction to ES5 features on the Opera blog.

          Full support for ES5 in major browsers was introduced from the following versions - Firefox 4, Chrome 7, Internet Explorer 9 and Opera 11.60. Safari 5.1 (and Mobile Safari in iOS 5) supports all ES5 features except Function.bind. Also, IE9 doesn't support strict mode. Juriy Zaytsev provides a comprehensive compatibility table of ES5 features, which I recommend you bookmark.

          So is it safe to use ES5 features in our JavaScript code? The answer largely depends on your user base. If the majority of your users come from Internet Explorer 6, 7 & 8, code with ES5 features will break for them. One way to solve this problem is to use ES5 shims for unsupported browsers. You may decide which shims to include depending on the features you use in your code. Also, if your code already depends on a utility library such as Underscore.js, which provides features similar to ES5, you may continue to use it. Most utility libraries will use the native implementation if available, before falling back to their own implementation.

          If you are writing server-side JavaScript based on Node.js, you can freely use ES5 features. Node.js is based on the V8 JavaScript engine, which is fully compatible with ES5. Another thing to consider is whether you should write your server-side JavaScript using CoffeeScript. If you do, you are limiting your ability to use native ES5 features. As I mentioned earlier, CoffeeScript compiles only to ES3-compatible JavaScript and has custom implementations for ES5 features such as bind. However, this is still an open issue with ongoing discussions, suggesting CoffeeScript may add the option to compile to ES5-compatible code in future.

          For the full reference of ES5, I recommend using the annotated HTML version done by Michael Smith - es5.github.com

          ES.Next (Harmony)

          The long-term plan for ECMAScript from the 2008 meeting was code-named Harmony. The committee is still accepting proposals for this edition. Most probably, this will become ES6, but given the past track record of ECMA-262, ES.Next would be a more suitable name until a release is made.

          The currently planned features for Harmony sound promising. Brendan Eich has shared some ideas for Harmony which seem to make the language more concise and powerful. Also, his presentation on Proxy Objects in Harmony sounds awesome.

          The SpiderMonkey and V8 JavaScript engines have already started implementing some of the Harmony-related features, such as Proxies and WeakMaps. It would still be premature to use these features on the client-side (in the Chromium browser you need to explicitly enable Harmony features via a special flag). Node.js 0.7 will ship with V8 version 3.8, giving you the opportunity to taste some of the Harmony features on the server-side.

          ]]>
          Learning Go - Types http://laktek.com/2012/01/27/learning-go-types http://laktek.com/2012/01/27/learning-go-types/#comments Thu, 26 Jan 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/01/27/learning-go-types One of the main reasons I embrace Golang is its simple and concise type system. It follows the principle of least surprise and, as per Rob Pike, these design choices were largely influenced by prior experiences.

          In this post, I will discuss some of the main concepts which are essential in understanding Golang's type system.

          Pre-declared Types

          Golang by default includes several pre-declared boolean, numeric and string types. These pre-declared types are used to construct other composite types, such as array, struct, pointer, slice, map and channel.

          Named vs Unnamed Type

          A type can be represented with an identifier (called a type name) or as a composition of previously declared types (called a type literal). In Golang, these two forms are known as named and unnamed types respectively.

          Named types can have their own method sets. As I explained in a previous post, methods are a form of function for which you specify a receiver.

            type Map map[string]string
          
            //this is valid
            func (m Map) Set(key string, value string){
              m[key] = value 
            }
          
            //this is invalid
            func (m map[string]string) Set(key string, value string){
              m[key] = value 
            }

          You can define a method with named type Map as the receiver; but if you try to define a method with unnamed type map[string]string as the receiver it's invalid.

          An important thing to remember is that pre-declared types are also named types. So int is a named type, but *int or []int is not.

          Underlying Type

          Every type has an underlying type. Pre-declared types and type literals refer to themselves as their underlying type. When declaring a new type, you have to provide an existing type. The new type will have the same underlying type as the existing type.

          Let's see an example:

            type Map map[string]string
            type SpecialMap Map

          Here the underlying type of map[string]string is itself, while underlying type of Map and SpecialMap is map[string]string.

          Another important thing to note is that the declared type will not inherit any methods from the existing type or its underlying type. However, the method set of an interface type and the elements of a composite type will remain unchanged. The idea here is that if you define a new type, you would probably want to define a new method set for it as well.
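
          Here's a small sketch of that behavior, reusing the Map type and Set method from the earlier example:

            type Map map[string]string
          
            func (m Map) Set(key string, value string) {
              m[key] = value
            }
          
            type SpecialMap Map // does not inherit Map's method set
          
            func main() {
              sm := SpecialMap{}
              // sm.Set("a", "b")   // compile error: SpecialMap has no Set method
              Map(sm).Set("a", "b") // converting to Map gives access to its methods
            }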

          Assignability

            type Mystring string
            var str string = "abc"
            var my_str Mystring = str //gives a compile error

          You can't assign str to my_str in the above case. That's because str and my_str are of different types. Basically, to assign a value to a variable, the value's type should be identical to the variable's type. It is also possible to assign a value to a variable if their underlying types are identical and one of them is an unnamed type.
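
          For example, assuming a named Map type like before, this assignment is allowed because map[string]string is an unnamed type:

            type Map map[string]string
          
            var m map[string]string = make(map[string]string)
            var my_map Map = m // valid: the underlying types are identical and map[string]string is unnamed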

          Let's try to understand this with a more elaborate example:

            package main
          
            import "fmt"
          
            type Person map[string]string
            type Job map[string]string
          
            func keys(m map[string]string) (keys []string) {
              for key, _ := range m {
                keys = append(keys, key)
              }
          
              return
            }
          
            func name(p Person) string {
              return p["first_name"] + " " + p["last_name"]
            }
          
            func main(){
              var person = Person{"first_name": "Rob", "last_name": "Pike"}
              var job = Job{"title": "Commander", "project": "Golang"}
          
              fmt.Printf("%v\n", name(person))
              fmt.Printf("%v", name(job)) //this gives a compile error
          
              fmt.Printf("%v\n", keys(person))
              fmt.Printf("%v\n", keys(job))
            }

          Here both Person and Job have map[string]string as the underlying type. If you try to pass an instance of type Job to the name function, it gives a compile error because it expects an argument of type Person. However, you will note that we can pass instances of both Person and Job types to the keys function, which expects an argument of the unnamed type map[string]string.

          If you still find the assignability of types confusing, I'd recommend reading the explanations by Rob Pike in the following discussion.

          Type Embedding

          Previously, I mentioned that when you declare a new type, it will not inherit the method set of the existing type. However, there's a way you can embed the method set of an existing type in a new type. This is possible by using the properties of anonymous fields in a struct type. When you define an anonymous field inside a struct, all its fields and methods will be promoted to the defined struct type.

            package main
          
            type User struct {
              Id   int
              Name string
            }
          
            type Employee struct {
              User       //anonymous field
              Title      string
              Department string
            }
          
            func (u *User) SetName(name string) {
              u.Name = name
            }
          
            func main(){
              employee := new(Employee)
              employee.SetName("Jack")
            }

          Here the fields and methods of User type get promoted to Employee, enabling us to call SetName method on an instance of Employee type.

          Type Conversions

          Basically, you can convert between a named type and its underlying type. For example:

            type Mystring string
          
            var my_str Mystring = Mystring("awesome")
            var str string = string(my_str)

          There are a few rules to keep in mind when it comes to type conversions. Apart from conversions involving string types, all other conversions only modify the type but not the representation of the value.

          You can convert a string to a slice of integers or bytes and vice-versa.

            []byte("hellø")
          
            string([]byte{'h', 'e', 'l', 'l', '\xc3', '\xb8'})

          More robust and complex run-time type manipulations are possible in Golang using interfaces and the reflect package. We'll see more about them in a future post.

          ]]>
          Stack on Go - A Wrapper for Stack Exchange API http://laktek.com/2012/02/06/stack-on-go http://laktek.com/2012/02/06/stack-on-go/#comments Sun, 05 Feb 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/02/06/stack-on-go Regular readers of this blog would know I've been spending my free time learning Go. Today, I present to you the first fruit of those learning experiences. Stack on Go is a wrapper library written in Go for the Stack Exchange API.

          When I first stumbled upon version 2.0 of the Stack Exchange API, I felt it was one of the best API designs I've ever seen. So I decided to write a wrapper for it in Go, which was a good way to learn both Golang and modern API design techniques.

          Stack on Go fully implements the Stack Exchange API 2.0 and it is compatible with the Go runtime at Google AppEngine. I hope this could be a good platform for some interesting apps such as notifiers, aggregators and stat analyzers based on Stack Exchange API (well, the possibilities are endless with such a rich dataset).

          Also, bear in mind Stack Exchange is running a competition and offering an iPad2 for the most awesome application submitted (that'd also give Stack on Go a great chance to become the best library ;) ).

          So if you've always wanted to learn Go but never got started, I hope this will be a great motivator.

          Installation

          Let's have a look at how to get started with Stack on Go.

          Stack on Go is fully compatible with Go1.

          To install the package, run:

            go get github.com/laktek/Stack-on-Go

          Basic Usage

          Once installed, you can use Stack on Go by importing it in your source.

            import "github.com/laktek/Stack-on-Go/stackongo"

          By default, the package will be named stackongo. If you want, you can give it an alternate name at import.

          Stack Exchange API contains global and site specific methods. Global methods can be directly called like this:

            sites, err := stackongo.AllSites(params)

          Before calling site specific methods, you need to create a new session. A site identifier should be passed as a string (usually, it's the domain of the site).

            session := stackongo.NewSession("stackoverflow")

          Then call the methods in scope of the created session.

            info, err := session.Info()

          Most methods accept a map of parameters. There's a special Params type that you can use to create a parameter map.

            //set the params
            params := make(stackongo.Params)
            params.Add("filter", "total")
            params.AddVectorized("tagged", []string{"go", "ruby", "java"})
          
            questions, err := session.AllQuestions(params)

          If you prefer, you can pass your parameters directly in a map[string]string literal:

            questions, err := session.AllQuestions(map[string]string{"filter": "total", "tagged": "go;ruby;java"})

          Most methods return a struct containing a collection of items and meta information (more details are available in the Stack Exchange docs). You can traverse the results to create an output:

            for _, question := range questions.Items {
              fmt.Printf("%v\n", question.Title)
              fmt.Printf("Asked By: %v on %v\n", question.Owner.Display_name, time.SecondsToUTC(question.Creation_date))
              fmt.Printf("Link: %v\n\n", question.Link)
            }

          You can use the returned meta information to make run-time decisions. For example, you can check whether there are more results and load them progressively.

            if questions.Has_more {
              params.Page(page + 1)
              questions, err = session.AllQuestions(params)
            }

          Authentication

          Stack Exchange follows the OAuth 2.0 workflow for user authentication. Stack on Go includes two helper functions tailored for authentication offered by the Stack Exchange API.

          AuthURL returns a URL to redirect the user to for authentication, and ObtainAccessToken should be called from the handler of the redirect URI to obtain the access token.

          Check the following code sample, which explains the authentication flow:

            func init() {
              http.HandleFunc("/", authorize)
              http.HandleFunc("/profile", profile)
            }
          
            func authorize(w http.ResponseWriter, r *http.Request) {
              auth_url := stackongo.AuthURL(client_id, "http://myapp.com/profile", map[string]string{"scope": "read_inbox"})
          
              header := w.Header()
              header.Add("Location", auth_url)
              w.WriteHeader(302)
            }
          
            func profile(w http.ResponseWriter, r *http.Request) {
              code := r.URL.Query().Get("code")
              access_token, err := stackongo.ObtainAccessToken(client_id, client_secret, code, "http://myapp.com/profile")
          
              if err != nil {
                fmt.Fprintf(w, "%v", err.String())
              } else {
                //get authenticated user
                session := stackongo.NewSession("stackoverflow")
                user, err := session.AuthenticatedUser(map[string]string{}, map[string]string{"key": client_key, "access_token": access_token["access_token"]})
          
                // do more with the authenticated user
              }
          
            }

          Using with AppEngine

          If you plan to deploy your app on Google AppEngine, remember to make one slight modification in your code. Since AppEngine has a special package to fetch external URLs, you have to set it as the transport method for Stack on Go.

          Here's how to do it:

            import (
              "appengine"
              "appengine/urlfetch"
              "github.com/laktek/Stack-on-Go/stackongo"
            )
          
            func main(){
              // r here is the *http.Request received by your AppEngine handler
              c := appengine.NewContext(r)
              ut := &urlfetch.Transport{Context: c}
          
              stackongo.SetTransport(ut) //set urlfetch as the transport
          
              session := stackongo.NewSession("stackoverflow")
              info, err := session.Info()
            }

          Under the Hood

          If you wish to write wrappers for other web app APIs in Go, you might be interested in knowing the implementation details of Stack on Go.

          Actually, the implementation is fairly straightforward. The following method is the essence of the whole library.

            func get(section string, params map[string]string, collection interface{}) (error os.Error) {
              client := &http.Client{Transport: getTransport()}
          
              response, error := client.Get(setupEndpoint(section, params).String())
          
              if error != nil {
                return
              }
          
              error = parseResponse(response, collection)
          
              return
          
            }

          Every method call is routed to the above function with the relevant struct, path and parameters provided. Using the path and parameters, we generate the endpoint URL. This is then called using the http.Client methods. Afterwards, the response and the provided struct interface are passed to a custom parser function. There the response body is read and parsed using the json.Unmarshal method. The JSON output is finally mapped to the provided struct via the interface. This is what the called method finally returns.
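
          As a rough sketch (not the library's actual code), such a parser function could look something like this, using the same pre-Go 1 os.Error style as the snippet above:

            func parseResponse(response *http.Response, collection interface{}) os.Error {
              defer response.Body.Close()
          
              body, error := ioutil.ReadAll(response.Body)
              if error != nil {
                return error
              }
          
              // map the JSON fields onto the provided struct via the interface
              return json.Unmarshal(body, collection)
            }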

          I used httptest, which is available in Go's standard packages, to unit test the library. All API calls were proxied (using a custom Transport) to a dummy server which serves fake HTTP responses. This setup makes it easy to test both request and response expectations.

            func createDummyServer(handler func(w http.ResponseWriter, r *http.Request)) *httptest.Server {
              dummy_server := httptest.NewServer(http.HandlerFunc(handler))
          
              //change the host to use the test server
              SetTransport(&http.Transport{Proxy: func(*http.Request) (*url.URL, os.Error) { return url.Parse(dummy_server.URL) }})
          
              //turn off SSL
              UseSSL = false
          
              return dummy_server
            }
          
            func returnDummyResponseForPath(path string, dummy_response string, t *testing.T) *httptest.Server {
              //serve dummy responses
              dummy_data := []byte(dummy_response)
          
              return createDummyServer(func(w http.ResponseWriter, r *http.Request) {
                if r.URL.Path != path {
                  t.Error("Path doesn't match")
                }
                w.Write(dummy_data)
              })
            }

          For those who like to dig deeper, the source code is available on GitHub. You can contact me if you need any help using Stack on Go. Also, feel free to report any issues and suggest improvements.

          ]]>
          Learning Go - Interfaces & Reflections http://laktek.com/2012/02/13/learning-go-interfaces-reflections http://laktek.com/2012/02/13/learning-go-interfaces-reflections/#comments Sun, 12 Feb 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/02/13/learning-go-interfaces-reflections In the post about Go Types, I briefly mentioned that Go provides a way to do run-time type inference using interfaces and the reflect package. In this post, we are going to explore these concepts in depth.

          What is an Interface?

          Imagine you stop a random cab expecting to go from A to B. You wouldn't care much about the nationality of the driver, how long he's been in the profession, whom he voted for in the last election or even whether he is a robot; as long as he can take you from A to B.

          This is how interfaces work in Go. Instead of expecting a value of a particular type, you are willing to accept a value from any type that implements the methods you want.

          We can represent our Cab Driver analogy in Go like this (assuming there are human and robot cab drivers):

          
            type Human struct {
            }
          
            func (p *Human) Drive(from Location, to Location){
              //implements the drive method
            }
          
            type Robot struct {
            }
          
            func (p *Robot) Drive(from Location, to Location){
              //implements the drive method
            }
          
            type Driver interface {
              Drive(from Location, to Location) 
            }
          
            func TakeRide(from Location, to Location, driver Driver){
              driver.Drive(from, to)
            }
          
            func main(){
              var A Location
              var B Location
          
              var random_human_driver *Human = new(Human)
              var random_robot_driver *Robot = new(Robot)
          
              //...
          
              TakeRide(A, B, random_human_driver)
              TakeRide(A, B, random_robot_driver)
            }

          Here we defined a type called Driver, which is an interface type. An interface type contains a set of methods. A type supporting the interface should fully implement this method set.

          In this instance, the pointer types of both Human and Robot (*Human and *Robot) implement the Drive method. Hence, they satisfy the Driver interface. So when calling the TakeRide function, which expects a Driver interface as an argument, we can pass a pointer to either a Human or a Robot.

          You can assign any value to an interface, if its type implements all methods defined by the interface. This loose coupling allows us to implement new types to support existing interfaces, as well as create new interfaces to work with existing types.

          I recommend reading the section on Interfaces in Effective Go, if you haven't already. It provides more elaborate examples on the usage of interfaces.

          Encapsulation

          Another benefit of interfaces is that they can be used for encapsulation. If a type only implements the methods of a given interface, it's OK to export only the interface without the underlying type. Obviously, this is helpful in maintaining a cleaner and more concise API.

          In our previous Cab Driver example, both the Human and Robot types don't have any other methods apart from the Drive method. This means we can export only the Driver interface and keep the human and robot types encapsulated in the package (hm... a paranoid cab company which doesn't reveal the true identities of its drivers!).

          
            // ...
            // Type & method declarations were skipped
          
            func TakeRide(from Location, to Location, driver Driver){
              driver.Drive(from, to)
            }
          
            func NewDriver() Driver {
              // this constructor will assign 
              // a random value of type *human or *robot
              // to Driver interface.
          
              return
            }
          
            func main(){
              var A Location
              var B Location
          
              var random_driver Driver = NewDriver() 
          
              //...
          
              TakeRide(A, B, random_driver)
            }

          We have introduced a new function called NewDriver(), which returns a Driver interface. The value of the Driver interface could be either *human or *robot.

          Empty Interface - interface{}

          It's possible to define an interface without any methods. Such an interface is known as an empty interface, and it's denoted by interface{}. Since there are no methods, any type will satisfy this interface.

          I'm sure most of you are familiar with the fmt.Printf function, which accepts a variable number of arguments of different types and produces formatted output. If we take a look at its definition, it accepts a variable number of empty interface (interface{}) values. This means Printf is using a mechanism based on empty interfaces to infer the types of values at run-time. If you read through the next sections, you will get a clue how it does that.
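
          As a small sketch, a variable of the empty interface type can hold a value of any type:

            func main() {
              var anything interface{}
          
              anything = 42                 // an int fits
              anything = "hello"            // so does a string
              anything = []string{"a", "b"} // and so does a slice
          
              _ = anything
            }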

          Type Assertion

          In Go, there's a special expression which lets you assert the type of the value an interface holds. This is known as Type Assertion.

          In our Cab Driver example, we can use type assertion to verify whether the given driver is a human.

            var random_driver Driver = NewDriver() 
            v, ok := random_driver.(*human)

          If the NewDriver() method returns an interface with a value of type *human, v will be assigned that value and ok will be true. If NewDriver() returns a value of type *robot, v will be set to nil and ok will be false.

          With type assertion, it is possible to convert one interface value to another interface value too. For the purpose of this example, let's assume there's another interface called Runner which defines a Run method, and that our *human type also implements the Run method.

            var random_driver Driver = NewDriver() 
            runner, ok := random_driver.(Runner)

          Now, when the Driver interface holds a value of type *human, it is possible for us to convert the same value to be used with the Runner interface too.

          Type Switches

          Using type assertions, Go offers a way to perform different actions based on a value's type. Here's the example given in the Go spec for type switching:

            switch i := x.(type) {
            case nil:
              printString("x is nil")
            case int:
              printInt(i)  // i is an int
            case float64:
              printFloat64(i)  // i is a float64
            case func(int) float64:
              printFunction(i)  // i is a function
            case bool, string:
              printString("type is bool or string")  // i is an interface{}
            default:
              printString("don't know the type")
            }

          Type switching uses a special form of type assertion, with the keyword type. Note that this notation is not valid outside of type switching context.

          Reflection Package

          Combining the power of empty interfaces and type assertions, Go provides the reflect package, which allows more robust operations on types and values at run-time.

          Basically, the reflect package gives us the ability to inspect the type and value of any variable in a Go program.

            import (
              "fmt"
              "reflect"
            )
          
            type Mystring string
          
            func main() {
              var x Mystring = Mystring("awesome")
              fmt.Println("type:", reflect.TypeOf(x))
              fmt.Println("value:", reflect.ValueOf(x))
            }

          This gives the output as:

            type: main.Mystring
            value: awesome

          However, this is just the tip of the iceberg. There's a lot more powerful stuff possible with the reflect package. I'll leave you with the "Laws of Reflection" blog post and the godoc of the reflect package. Hope its capabilities will fascinate you.

          Further Reading

          ]]>
          Learning Go - Functions http://laktek.com/2012/02/23/learning-go-functions http://laktek.com/2012/02/23/learning-go-functions/#comments Wed, 22 Feb 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/02/23/learning-go-functions Functions are first-class citizens in Go. In the very first post of this series, we learnt that Go functions can return multiple values. Apart from multiple return values, there are several other interesting features in Go functions that are worth exploring.

          Higher-order Functions

          In Go, a function can take another function as an argument and can also return a function. This feature, known as higher-order functions in functional programming, is a great way to define dynamic and reusable behavior.

          For example, Map function from the strings package takes a mapping function as its first argument. We can come up with a simple cipher algorithm by passing a function to Map.

            output := strings.Map(func(c int) int {
                  alphabet := "abcdefghijklmnopqrstuvwxyz"
                  cryptabet := "THEQUICKBROWNFXJMPSVLAZYDG"
          
                  return utf8.NewString(cryptabet).At(strings.Index(alphabet, string(c)))
              }, "hello")

          What if you want to encode your message with a different cryptabet? How about something fancy like Sherlock Holmes' Little Dancing Men? Current implementation doesn't have that flexibility, but let's try to extend our code.

            func CipherGenerator(cryptabet string) func(int) int {
              return func(c int) int {
          
                alphabet := "abcdefghijklmnopqrstuvwxyz"
                encoded_cryptabet := utf8.NewString(cryptabet)
          
                return encoded_cryptabet.At(strings.Index(alphabet, string(c)))
          
              }
            }
          
            func main() {
              fmt.Printf(strings.Map(CipherGenerator("☺☻✌✍✎✉☀☃☁☂★☆☮☯〠☎☏♕❤♣☑☒✓✗¢€"), "hello"))
            }

          So we created a function called CipherGenerator. It can accept any Unicode string as a cryptabet and returns a function of type func(int) int, which is assignable as a mapping function to strings.Map.

          User-Defined Function types

          When you define a function that returns another function, its signature can get a little too complex. In the previous example, the function signature was func CipherGenerator(cryptabet string) func(int) int. This is not easy to comprehend at a glance. We can make it more readable by declaring a named type for the returned function.

            type MappingFunction func(int) int

          Now we can declare the CipherGenerator function like this:

            func CipherGenerator(cryptabet string) MappingFunction {
              // ...
            }

          Since the underlying type matches, it is still assignable to the strings.Map.

          Closure

          In the function literal that's returned from CipherGenerator, we are referencing the variable cryptabet. However, cryptabet is defined only in the scope of CipherGenerator. So how does the returned function have access to cryptabet each time it runs?

          This property is known as a closure. In simple terms, a function will inherit the variables from the scope in which it was declared.

          Applying the same principle, we can also move the variable alphabet from the scope of the returned function to the scope of CipherGenerator.

            func CipherGenerator(cryptabet string) MappingFunction {
              alphabet := "abcdefghijklmnopqrstuvwxyz"
          
              return func(c int) int {
          
                encoded_cryptabet := utf8.NewString(cryptabet)
          
                return encoded_cryptabet.At(strings.Index(alphabet, string(c)))
          
              }
            }

          Deferred Calls

          In Go functions, you can use a special statement called defer. Defer statements invoke function or method calls immediately before the surrounding function (the function that contains the defer statement) returns.

          Defer statements are most commonly used for cleanup jobs. Their execution is ensured whatever return path your function takes.

            func Read(reader io.ReadCloser) string {
              defer reader.Close()
          
              // ...
            }

          A defer statement evaluates the arguments to the function call at the time it is executed. However, the function call itself isn't invoked until the surrounding function returns. Check the example below.

            func FavoriteFruit() (fruit string) {
              fruit = "Apple"
              defer func(v string) {
                fmt.Printf(v)
              }(fruit)
          
              fruit = "Orange"
          
              return
            }

          Though the value of fruit is later changed to Orange, the deferred function call will still print Apple. This is because that was the value fruit held when the defer statement was executed.

          Since we can use function literals in defer statements, closure property applies to them too. Instead of passing the variable fruit in the defer function call, we can directly access it from the deferred function.

            func FavoriteFruit() (fruit string) {
              fruit = "Apple"
              defer func() {
                fruit = "Orange"
              }()
          
              return
            }

          What is more interesting here is that the deferred call actually modifies the return value of the surrounding function. This is because the deferred call is invoked before the return values are passed to the caller.

          You can have multiple defer statements within a function.

            func f() (output string) {
              defer fmt.Printf("first deferred call executed.") 
              defer fmt.Printf("second deferred call executed.") 
          
              return "Function returned"
            }
          
            fmt.Printf(f())
          
            // Output:
            // second deferred call executed.
            // first deferred call executed.
            // Function returned

          As you can see from the output, defer statements are executed in the Last-In-First-Out(LIFO) order.

          Variadic Functions

          If you've noticed, you can invoke fmt.Printf with a variable number of arguments. In one instance, you may simply call it as fmt.Printf("Hello") and in another as fmt.Printf("Result of %v plus %v is %v", num1, num2, (num1 + num2)). Since you have to define the parameters when declaring a function signature, how is this possible?

          You can define the last parameter to a function with a type prefixed with .... Then it means the function can take zero or more arguments of the given type. Such functions are known as variadic functions.

          Here's a simple variadic function to calculate the average of a series of integers.

            func Avg(values ...int) float64 {
              sum := 0.0
              for _, i := range values {
                sum += float64(i)
              }
              return sum / float64(len(values))
            }
          
            Avg(1, 2, 3, 4, 5, 6) //3.5

          In a variadic function, the arguments to the last parameter are collected into a slice of the given type. We can also pass an already composed slice instead of individual values to the variadic function. However, the argument should be suffixed with ....

            values := []int{1, 2, 3, 4, 5, 6}
            Avg(values...) //3.5

          By using the empty interface type with variadic functions, you can come up with very flexible functions similar to fmt.Printf.
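
          Here's a small sketch of such a function (Describe is just an illustration, not part of any package; it relies on fmt.Printf from the fmt package):

            func Describe(values ...interface{}) {
              for _, v := range values {
                fmt.Printf("%v of type %T\n", v, v)
              }
            }
          
            Describe("hello", 42, 3.5) // accepts any mix of types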

          Further Reference

          Go Codewalk on First-Class Functions. http://golang.org/doc/codewalk/functions/

          ]]>
          A Few cURL Tips for Daily Use http://laktek.com/2012/03/12/curl-tips-for-daily-use http://laktek.com/2012/03/12/curl-tips-for-daily-use/#comments Sun, 11 Mar 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/03/12/curl-tips-for-daily-use Though I knew cURL was a powerful tool, so far I had never made an attempt to get familiar with it. Most of the time, I would just wade through its man pages to find a way to get my stuff done. Recently I found myself making use of it for many of my daily tasks. Through that heavy usage, a couple of recurring patterns emerged.

          If you are already familiar with cURL, you may not find anything interesting or new here (but feel free to point out any improvements or other useful tips in comments).

          Resume failed downloads

          cURL has a handy option (-C or --continue-at) to set a transfer offset, which helps to resume failed downloads. In most cases, setting the offset as a single dash will let cURL decide how to resume the download.

            curl -C - -L -O http://download.microsoft.com/download/B/7/2/B72085AE-0F04-4C6F-9182-BF1EE90F5273/Windows_7_IE9.part03.rar

          It's a shame that I came to know about this only recently. I would now be cursing a lot less at my ISP.

          Fetch request body from a file

          Nowadays, most web service APIs demand request bodies to be formatted as JSON. Manually entering a JSON-formatted string on the command line is not a very convenient option. A better way is to prepare the request body in a file and provide it to cURL.

          Here's an example of creating a gist, providing the payload from a JSON file.

            curl -d @input.json https://api.github.com/gists

          Start the data parameter with a @ to tell cURL that it should fetch the request body from the file at the given path.

          Mimic AJAX requests

          Sometimes I need to create endpoints in web apps that produce alternate responses when accessed via AJAX (eg. not rendering the layout). Testing them directly in the browser is not very viable, as it requires bootstrapping code. Instead, we can mimic AJAX requests from cURL by providing the X-Requested-With header.

            curl -H "X-Requested-With: XMLHttpRequest" https://example.com/path

          Store and Use Cookies

          Another similar need is to test the behavior of cookies. Especially, when you want to alter a response depending on a cookie value.

          You can use cURL to download the response cookies to a file and then use them on the subsequent requests. You can inspect the cookie file and even alter it to test the desired behavior.

            curl -L -c cookies.txt http://example.com 
            curl -L -b cookies.txt http://example.com

          View a web page as GoogleBot

          When I was running this blog on WordPress, Google marked it as a site infected with malware. Panicked, I visited the site and checked the source. I couldn't see anything suspicious. Later I discovered the malware was injected into the site only when it was accessed by GoogleBot. So how do you see a site's output as GoogleBot?

          cURL's option (-A or --user-agent) to change the user agent of a request comes in handy in such instances. Here's how you can impersonate GoogleBot:

            curl -L -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://example.com

          Peep into others' infrastructure

          This is not exactly a cURL feature, but it comes in handy when I want to find out what others use to power their apps/sites.

            curl -s -L -I http://laktek.com | grep Server

          Yes, now you know this blog runs on AmazonS3 :).

          ]]>
          Punch - A Fun and Easy Way to Build Modern Websites http://laktek.com/2012/04/19/punch-a-fun-and-easy-way-to-build-modern-websites http://laktek.com/2012/04/19/punch-a-fun-and-easy-way-to-build-modern-websites/#comments Wed, 18 Apr 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/04/19/punch-a-fun-and-easy-way-to-build-modern-websites A few months ago, I switched this blog from WordPress to Jekyll. I love how Jekyll allows me to prepare everything locally and simply publish when it's ready. There's no server-side logic involved, which means I can host the whole blog in an S3 bucket. Also, there are no more worries about mundane issues like security, performance or database corruption.

          I wanted to have this freedom and control in any site that I create and manage. Actually, most websites can be thought of as static sites. They may contain a few bits and pieces that need to be rendered dynamically. The rise of modern browsers means we can pass this concern to the client-side.

          However, websites have different requirements from a blog. A website contains different pages carrying unique presentation, along with a few reusable blocks (for example, take a look at the different pages on the 37Signals site). A page will not contain just a title and a block of text, but is composed of different types of content such as lists, images, videos, slides and maps. Also, a modern website often needs to be A/B tested and translated into other languages. Even though we can use blog engines like Jekyll (or WordPress), I felt there was still a void for a tool tailored to creating and managing websites.

          That's why I created Punch.

          What is Punch?

          The aim of Punch is to help anyone build (and maintain) modern websites using only HTML, CSS and JavaScript. Punch is largely inspired by Jekyll, but it's not a blog engine. It's intuitive to use and easy to extend.

          Punch is written with Node.js and will work with your local file system. Currently, Punch renders template files written in Mustache. It expects content to be available in JSON format. Punch can also parse content in Markdown format.

          To generate a site, templates and contents should be available in two directories. For each Mustache template found in the templates directory, Punch will look for relevant content in the contents directory. Contents should be presented in a JSON file having the same name as the template. Alternatively, you can create a directory with the same name as template to store multiple JSON and Markdown files. Punch will save the rendered file in the output directory. Any other files(HTML, images, JS, CSS, etc.) and directories under templates will be copied to the output directory without any modification.

          How to Use

          Here's a quick screencast on how to use Punch.

          For more details on installation and usage, please refer to the README and the User Guide.

          Easy Client-side Rendering

          As I mentioned earlier, we can render any dynamic blocks of a site on the client-side. Since Punch is written in JavaScript, we can easily use its renderer on the client-side as well. You can see this feature used on Punch's homepage to render the "GitHub Watchers" block.

          To use the renderer, you must include Mustache.js and Punch's Mustache renderer in the HTML page.

            <script type="text/javascript" src="assets/mustache.js"></script>
            <script type="text/javascript" src="node_modules/punch/lib/renderers/mustache.js"></script>

          Then you have to instantiate a new MustacheRenderer and provide the template, content and any partials you need to render. Since Punch's renderer works asynchronously, it can be used reliably in contexts which involve AJAX content fetching. You can see below how I pass the JSON response from the GitHub API and a template fragment to the renderer to render the GitHub Watchers block. We must also provide a callback function to execute after rendering is done.

            // Load and Render GitHub followers
            (function(){
              if($("#github_followers").length > 0){
                var renderer = new MustacheRenderer();
          
                renderer.afterRender = function(output){
                  $("#github_followers").html(output);
                };
          
                renderer.setTemplate('{{#followers}} \
                                      <a href="http://github.com/{{login}}" rel="nofollow"><img size="16" src="{{avatar_url}}" title="{{login}}" alt="{{login}}"/></a> \
                                      {{/followers}} \
                                    ');
                renderer.setPartials({});
          
                $.getJSON("https://api.github.com/repos/laktek/punch/watchers", function(data){
                  renderer.setContent({"followers": data});
                });
              }
            })();

          Start Playing!

          I have already been using Punch to create a few personal sites and just finished porting CurdBee's public site to Punch. Every time I use Punch it has been a pleasant experience, and I feel I have the freedom and control to create sites the way I want. I hope most of you will start to feel the same with Punch. Also, I have plans to support features such as browser-based content editing and easy publishing options, which could make Punch even more awesome.

          Go install it, build nice sites, spread the word and send me pull requests!!

          ]]>
          Create Quick HTML5 Presentations with Punch http://laktek.com/2012/04/27/create-quick-html5-presentations-with-punch http://laktek.com/2012/04/27/create-quick-html5-presentations-with-punch/#comments Thu, 26 Apr 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/04/27/create-quick-html5-presentations-with-punch HTML5 presentations are cool and convenient. They just work in the browser, and thanks to CSS3 & JavaScript they can be made to look even better than traditional slides. It's easy to link to individual slides. Also, demos (if you're showing something related to web technologies) can be done on the slides themselves. There are already a couple of good frameworks such as Google's HTML5Slides, deck.js and impress.js to create HTML5 presentations.

          However, in all of the above frameworks, you have to define the slides as HTML blocks (usually wrapped inside a div or a section). It's fairly trivial, but you may feel editing the slides gets messy and verbose when you have dozens or more slides. Especially if you are trying to put together a presentation in a rush (which you should try to avoid).

          I felt the same while creating the presentation on Punch, which I recently presented at RefreshColombo. Then I realized Punch itself can be used to make the process of creating HTML5 presentations quick and painless.

          This is what I did. Rather than adding the slides into the HTML page, I created them in a separate JSON file. Here's how it was structured.

            {
              "slides": [
                {
                "slogan": "Main Title of Presentation"
                },
          
                {
                  "slogan": "Title of the Slide"
                , "primary_text": "A summary or sub-title"
                },
          
                {
                  "slide_title": "Title of the Slide"
                , "primary_text": "This is a slide with just text. This is a slide with just text."
                },
          
                {
                  "slide_title": "Title of the Slide"
                , "image": "cat_picture.jpg"
                , "footnote": "Source: http://www.flickr.com/photos/splityarn/2363974905/"
                }
              ]
            }

          Then I saved the HTML page for slides as a Mustache template. There I added a section for slides, which can be rendered iteratively based on different content (as defined in the above JSON). Here's the mustache portion of the template which would be rendered:

          
            ...
          
            {{#slides}}
            <article class="{{class}}" >
          
              {{#slogan}}
                <h2>{{{slogan}}}</h2>
              {{/slogan}}
          
              {{#slide_title}}
                <h3>{{{slide_title}}}</h3>
              {{/slide_title}}
          
              {{#primary_text}}
                <p class="primary">{{{primary_text}}}</p>
              {{/primary_text}}
          
              {{#image}}
                <img src="images/{{image}}" class="centered"/>
              {{/image}}
          
              {{#code}}
                <pre>{{code}}</pre>
              {{/code}}
          
              {{#list}}
                <ul class="build">
                  <li>{{{.}}}</li>
                </ul>
              {{/list}}
          
              {{#secondary_text}}
                <p class="secondary">{{{secondary_text}}}</p>
              {{/secondary_text}}
          
              {{#footnote}}
                <p class="source">{{{footnote}}}</p>
              {{/footnote}}
          
            </article>
            {{/slides}}
          
            ...

          Finally, I ran the punch command to generate the following HTML5 presentation - http://laktek.com/presentations/punch/slides.html

          I have created a simple bootstrapper by extracting the common slide formats from my presentation, which you can use straight away. I used a slightly modified version of Google's HTML5Slides. If you are happy with the default styles, you can just edit the contents/slides.json file and replace it with your slides. When you are done, run the punch command in the top-most directory to generate the HTML output.
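
          To put that concretely, the workflow is roughly this (the directory name below is just a placeholder for wherever you cloned the bootstrapper; the source link is at the end of this post):

            # inside your copy of the bootstrapper
            cd my-presentation
            # replace the sample slides in contents/slides.json with your own, then generate the HTML:
            punch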

          You can edit the default template (templates/slides.mustache) to define any additional blocks to use in the slides or even change the template entirely to use a different presentation framework such as deck.js or impress.js.

          The source is available in GitHub. Fork it and use!

          ]]>
          Rapidly Prototyping Web Applications with Punch http://laktek.com/2012/05/17/rapidly-prototyping-web-applications-using-punch http://laktek.com/2012/05/17/rapidly-prototyping-web-applications-using-punch/#comments Wed, 16 May 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/05/17/rapidly-prototyping-web-applications-using-punch The process of prototyping a web application is two-fold. You have to focus on the front-end, as well as the underlying data layer which powers it. The representation of the underlying data layer is what we usually call the API. The front-end should know the types of data exposed by the API, and the API should know what data it should expose to the front-end. So it is essential to prototype these two layers hand in hand.

          However, it's extremely difficult to peel apart and handle these two layers seamlessly with existing prototyping tools. After trying out different prototyping techniques, I finally settled on a simple workflow based on Punch.

          In this post, I'll explain how to use this workflow to prototype a web application.

          Start with the Front-end

          First of all, we should create a new project for the prototyping task. It can be done by running:

            > punch setup simple_issues

          I thought it would be cool to create a prototype for a simple issue tracker similar to GitHub Issues, hence the name simple issues. Inside the simple_issues directory you will find two directories - contents & templates - and a config file.

          Let's start off by prototyping the interface to show a list of issues. We shall create a file named issues.html in the templates directory. I used boilerplate code based on HTML5 Boilerplate and Twitter Bootstrap to build the prototype. Twitter Bootstrap is great for these kinds of quick prototyping tasks. It's trivial to select a layout, add visual components and even assign basic interactivity to them. It allows us to focus on the overall arrangement, rather than spending time tinkering with minor details.

          After adding all the essential UI blocks to the prototype, we can proceed to check the result in the browser. For this, we need to start Punch's development server (this is actually a new feature introduced in Punch, so make sure you have the latest version installed).

            > cd simple_issues 
            > punch server

          Punch server will start on port 9009 by default. You can visit http://localhost:9009/issues to view the result.

          Here's the initial prototype I came up with.

          Extract the API

          As I said before, the front-end interface only covers part of the prototyping process. We need to figure out the API as well. For this, we have to look at what real data is needed to make the front-end meaningful.

          In this example, the list of issues is our main concern. Guided by the front-end, we can come up with a basic representation of the API response for a list of issues. Note that we use JSON as the data format for our API.

            {
              "issues": [
          
                {
                    "id": 8
                  , "permalink": "http://localhost:9009/issue"
                  , "title": "Any way to get Punch on windows?"
                  , "user": {
                      "name": "sc0ttwad3" 
                    }
                  , "state": "closed"
                }
              ]
            }

          We save our API prototypes in the contents directory using the same name as its front-end prototype (in this case issues.json).

          In the original front-end prototype we had the list of issues hard-coded as a table:

            <table class="table table-striped">
              <thead>
                <tr>
                  <th>#</th>
                  <th>Description</th>
                  <th>Reported By</th>
                  <th>State</th>
                </tr>
              </thead>
          
              <tbody>
          
                <tr>
                  <td>8</td>
                  <td><a href="#">Any way to get Punch on windows?</a></td>
                  <td>sc0ttwad3</td>
                  <td>Closed</td>
                </tr>
          
              </tbody>
            </table>

          We shall replace those table rows with a Mustache template, so that we can hook the API prototype we just created.

            {{#issues}}
            <tr>
              <td>{{id}}</td>
              <td><a href="{{permalink}}">{{title}}</a></td>
              <td>{{user.name}}</td>
              <td><span class="label">{{state}}</span></td>
            </tr>
            {{/issues}}
            {{^issues}}
            <tr>
              <td colspan="4">Hooray! You have no issues.</td>
            </tr>
            {{/issues}}

          Remember to save the prototype page again with the name "issues.mustache". Otherwise Punch will not know it needs to be generated.

          Refine and Repeat

          Now you can try changing the issues.json file to see how it reflects on the front-end prototype. You can refine the front-end until you're satisfied with the representation of the data. Also, you can tweak the API to suit the requirements of the front-end.

          Since Punch's development server generates the site on each request, you can check these changes immediately. Check the following demo to understand the full effect:

          Reuse Parts

          Another benefit of using Punch for prototyping is that you can easily reuse the parts that are repeated across prototypes. For example, if we want to prototype the page for an individual issue, we can reuse the header and footer sections we created earlier for the issues page. Move the repeated block into a new file and give it a name starting with an underscore (eg. _header.mustache). Then Punch will treat it as a partial.

          Here's the sample template for a single issue, which includes header and footer as partials:

          
            {{> header }}
          
            <div class="row-fluid">
          
              {{#issue}}
              <div class="span12">
                <h2>{{title}}</h2>
                <span>Reported by {{user.name}} on {{created_on}}</span>
                <p>{{{body}}}<p>
              </div>
              {{/issue}}
          
              <hr/>
          
              {{#comments}}
              <div class="comment">
                <span>{{user.name}} said ({{created_on}}):</span>
                <p>{{body}}</p>
              </div>
              {{/comments}}
          
            </div>
          
            {{> footer }}

          Moving on to Implementation

          Once you are done prototyping the front-end and API for a feature, you can start implementing it. Designers can use the front-end prototype as the base for the actual template implementation. Since Mustache templates don't embed any logic, their definitions can be easily translated into any other templating language you prefer (such as ERB, Jade, etc.).

          Similarly, developers can use the prototyped API responses as the expectations for the actual API implementation. Prototyping this way helps to reduce a lot of the impedance mismatches that normally arise during the integration of the front-end and the API.

          ]]>
          Winning People with Code http://laktek.com/2012/05/23/winning-people-with-code http://laktek.com/2012/05/23/winning-people-with-code/#comments Tue, 22 May 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/05/23/winning-people-with-code Last Monday, Laknath and I met with Jervis and his lovely wife Himashini, for what turned out to be the first unofficial meetup of Punch users.

          Jervis is an electrical engineer from Australia who runs his own company, where he teaches Python to other electrical engineers (well, it was fascinating to learn that our power plants are controlled by Python scripts). Jervis has been one of the early adopters of Punch and he contributes to making it better. A couple of weeks back, he emailed me saying he was visiting Sri Lanka and would love to meet me for a coffee. I was flattered by that compliment itself.

          So last Monday, Jervis came to Sri Lanka with his wife and we met in the evening at Tintagel. It was amazing to listen to Jervis enthusiastically explaining how he uses Punch to host his video tutorials and how it simplified his workflow. We discussed the future of the project, what can be improved and what other interesting tools we can integrate with Punch. Our chat didn't stick only to Punch. We moved on to many other interesting topics such as hacker culture, how to learn human languages quickly (thanks to his wife, Jervis can speak Sinhala quite well) and even how to make ice-cream at home (meanwhile, our evening got extended from a coffee to a wonderful dinner). Jervis and Himashini are really nice people and we share a lot in common. I'm glad a project like Punch helped us become friends.

          This is the beauty of writing and releasing code. Many still think of coding as an isolated battle with a dumb machine. Very few would believe you if you said you could win people with code. But it's true.

          Almost all the projects I have open-sourced were originally written just to scratch my own itch. I never thought that others would find them useful or contribute back to make them better. But the messages I receive via GitHub say otherwise. I realized that every piece of code we write has the potential to make some kind of an impact on others' lives. I see code as a powerful form of expression, which can bring people around the world together.

          If you love your code, share it. The feeling you will get when others fall in love with your work is just amazing.

          ]]>
          Don't Fret Over Client-Side Rendering http://laktek.com/2012/05/31/dont-fret-over-client-side-rendering http://laktek.com/2012/05/31/dont-fret-over-client-side-rendering/#comments Wed, 30 May 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/05/31/dont-fret-over-client-side-rendering Twitter is ditching client-side rendering and moving back to server-side. Does this mean client-side rendering is bad? Should we also move back to server-side? Well, it's insane to spread FUD about client-side rendering just because it didn't work for Twitter (remember "Rails can't scale"?). Rather, make it a lesson on how to use client-side rendering sensibly.

          "In our fully client-side architecture, you don’t see anything until our JavaScript is downloaded and executed."
          - Twitter Engineering Blog

          In essence, this was the problem. When serving a web request, it's important to get the most essential information in front of users' eyes quickly. The best way to achieve this is to identify and render the essential blocks on the server itself. You can then delegate the rest to be rendered on the client.

          We take this approach for CurdBee. The screenshot below shows how CurdBee's Home screen is displayed on a slow connection or when JavaScript is disabled.

          Screenshot of CurdBee home screen

          In most instances, users come to the Home screen only to proceed to another action, such as creating an invoice or adding a time entry. By rendering the navigation blocks on the server, we allow them to do this action immediately. Since the remaining blocks are rendered progressively, users can start interacting without having to wait till everything is loaded.

          We use a very simple client-side rendering logic. Templates and data (JSON) are fetched and rendered only when they are actually needed on screen. There's no eager loading involved. This helps us keep the bootstrapping JavaScript code to a minimum, so it loads and parses faster. Also, rendering happens asynchronously, allowing us to progressively render the blocks. We use appropriate caching for the templates and data to make the responses fast.
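
          As a rough sketch only (this is not CurdBee's actual code; it assumes jQuery and mustache.js are loaded, and that each deferred block declares hypothetical data-template and data-source attributes):

            // Fetch the template and JSON for a block only when it's actually needed,
            // then render it asynchronously into the page.
            function renderBlock($block) {
              $.get($block.data('template'), function(template) {
                $.getJSON($block.data('source'), function(data) {
                  $block.html(Mustache.render(template, data));
                });
              });
            }

            // Render the deferred blocks after the essential, server-rendered parts are in place.
            $(function() {
              $('.deferred-block').each(function() {
                renderBlock($(this));
              });
            });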

          So if used properly, client-side rendering can actually improve the overall user experience. Rather than trying to do everything on the server-side or the client-side, figure out how you can have the best of both worlds.

          ]]>
          Punch Status - Instant Previews, Easy Publishing & More http://laktek.com/2012/06/30/punch-status http://laktek.com/2012/06/30/punch-status/#comments Fri, 29 Jun 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/06/30/punch-status It's been 2 months since I originally released Punch. I was encouraged by the initial feedback it received and have been committing most of my free time to improving it further. In the past two months, Punch made a lot of progress and several nice features were added to the project.

          Punch focuses on providing a simple and better workflow to create, preview and publish websites. This new quick screencast will help you get a better understanding of Punch's workflow:

          Here are some of the important features that were introduced recently.

          Instant Previews

          Punch now ships with a built-in development server. This will re-generate the site before each page request, making it possible to preview changes in the browser instantly.

          To start the server, run the following command in your site's directory:

              punch s

          This will start a server listening on port 9009. If you want, you can provide an alternate port when starting the server.

              punch s 3000

          Easy Publishing

          A site created with Punch is just a collection of static files. You can publish it by simply copying the files to a web host. Now, Punch makes the publishing process even simpler for you. All you need to do is run a single command (punch p) to publish your site. Punch can handle publishing to Amazon S3 or to any remote server that supports the SFTP protocol.

          To learn more details about publishing, check this wiki article.

          New Homepage & Boilerplate

          Punch now boasts a beautiful homepage, thanks to the great work done by Collin Olan. I hope the new homepage will add a positive vibe to the project.

          Also, Collin created Backfist, a boilerplate to create sites with Punch. It's a good base for you to get started and learn the conventions of Punch.

          Guide Wiki

          One of the common problems among most open source projects is outdated and scattered documentation (in blog posts, issue tickets, README, Wiki and even word of mouth). I thought it would be better if we set a convention early to avoid this happening to Punch.

          So I created Punch wiki, which will serve as the official guide for the project.

          Future

          I'm really happy with the way the project progressed in its first two months. I have plans to make it even more awesome in the days to come. The first step is to get more people to try Punch and convince them to make it part of their toolbox.

          So if you are already using Punch, spread the good word about it to others. If you see any issues or possible improvements, please feel free to report them and help get them fixed.

          Remember, behind every successful open-source project is a great community!

          ]]>
          SPDY for Rest of Us http://laktek.com/2012/07/10/spdy-for-rest-of-us http://laktek.com/2012/07/10/spdy-for-rest-of-us/#comments Mon, 09 Jul 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/07/10/spdy-for-rest-of-us There has been a lot of buzz around SPDY lately, a brand new protocol introduced by Google to make the web faster. Most major browsers and web servers have begun to make releases with SPDY support. So I felt it's worthwhile to explore SPDY and how it matters to us as web developers.

          What is SPDY?

          The World Wide Web was built on top of the HTTP protocol. It's an application layer protocol that uses TCP as the transport layer (if this layered concept confuses you, read about the OSI model). HTTP is stateless, and traditionally each request requires the client to create a new connection with the server. Though this initially helped to keep things simple, it causes issues with the current demands of the web.

          SPDY is designed to address these shortcomings in the HTTP protocol. It uses a single connection for all request/response cycles between the client and server.

          The core of SPDY is the framing layer, which manages the connection between two end-points and the transfer of data. There can be multiple data streams between the two end-points. On top of the framing layer, SPDY implements the HTTP request/response semantics. This makes it possible to use SPDY to serve any existing web site with little or no modification.
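
          For instance, here is a minimal sketch of serving an existing Node.js request handler over SPDY. It assumes the third-party spdy module and hypothetical certificate paths; the handler itself is the same one you would pass to https.createServer.

              // npm install spdy
              var spdy = require('spdy');
              var fs = require('fs');

              var options = {
                key: fs.readFileSync('keys/server.key'),   // hypothetical key/cert paths
                cert: fs.readFileSync('keys/server.crt')
              };

              spdy.createServer(options, function(req, res) {
                res.writeHead(200, { 'Content-Type': 'text/plain' });
                res.end('hello over SPDY');
              }).listen(443);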

          What are the benefits of using SPDY?

          • Since there's no connection overhead for each request, response latency would be low.

          • Apart from the content body, SPDY also compresses the request/response headers. This will be useful on mobile and other slow connections, where every byte would count.

          • The way SPDY is designed mandates the use of SSL for all communications between client and server, which means sites will be more secure.

          • Rather than waiting till the client initiates a request, the server can push data to the client with SPDY. This helps sites do smart prefetching of resources.

          Is it production ready?

          Many of Google's apps and Twitter already use SPDY to serve their content. The Amazon Silk browser is also known to use a SPDY-powered proxy.

          SPDY sessions in Chrome

          Visit the following URL in Chrome, to see which sites are currently using SPDY.

              chrome://net-internals/#spdy

          How can I add SPDY support for my site?

          The easiest way to add SPDY support is by enabling mod_spdy for Apache. mod_spdy alters Apache's default request/response cycle via hooks and adds SPDY handling.

          Speed gains from switching to SPDY may vary depending on the existing configuration of your site. If you are using HTTP performance optimizations such as wildcard asset hosts (ie. assets%d.example.com), that could cause SPDY to create multiple connections to the same end-point, reducing its efficiency. Some browsers, like Chrome, handle this smartly by pooling the connections. Also, CDNs such as Amazon CloudFront still don't support SPDY, so those resources need to be loaded over regular HTTP connections.

          You can use a tool like Chromium Page Benchmarker, to check how your site performs with and without SPDY under different configurations.

          Screenshot of Page Benchmarker

          Do I need to have a SSL certificate?

          For the initial connection, mod_spdy uses the SSL handshake and Next Protocol Negotiation (NPN) to advertise SPDY availability to the client. Also, to work across existing HTTP proxies, SPDY requires data streams to be tunneled through SSL. This means you will currently need a valid SSL certificate for your site to support SPDY.

          However, it is possible to test SPDY locally without SSL. For this, you can run Chrome with the flag --use-spdy=no-ssl and you may use a SPDY server implementation that works without SSL.

          Does SPDY help in building real-time web apps?

          It's worth noting that SPDY doesn't provide any special benefits for the use of WebSockets. Though they might look similar in high-level descriptions, they are totally independent protocols. They were created for different purposes and even the internal framing algorithms of the two are different.

          On the other hand, SPDY's inherently asynchronous streams will help in implementing features such as Server-Sent Events.

          SPDY is not a silver bullet!

          SPDY will continue to mature into a more stable protocol and, with time, it may succeed the HTTP protocol. But unless you run a heavyweight site, supporting SPDY won't have any immediate effect on your conversions or cost savings.

          So it's always better to start with the general page optimization techniques and then consider SPDY if you still want to cut down those extra milliseconds.

          Further Reading:

          ]]>
          Distraction Free Writing with Vim http://laktek.com/2012/09/05/distraction-free-writing-with-vim http://laktek.com/2012/09/05/distraction-free-writing-with-vim/#comments Tue, 04 Sep 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/09/05/distraction-free-writing-with-vim I recently configured Vim to switch to a special writing mode upon opening Markdown files. It helps me keep focus while writing by avoiding distractions. I first saw this concept in iA Writer and immediately fell in love with it. However, my muscle memory was too tied to Vim and I didn't want to switch to a different editing environment solely for this purpose. Hence, I tried to make Vim behave in a similar manner.

          Check this screencast to see it in action:

          For anyone interested, these are the steps I took to set it up. Please note that I tried this with MacVim on OS X Lion. It may not work as expected in other versions or on other OSs.

              " turn-on distraction free writing mode for markdown files
              au BufNewFile,BufRead *.{md,mdown,mkd,mkdn,markdown,mdwn} call DistractionFreeWriting()
          
              function! DistractionFreeWriting()
                  colorscheme iawriter
                  set background=light
                  set gfn=Cousine:h14                " font to use
                  set lines=40 columns=100           " size of the editable area
                  set fuoptions=background:#00f5f6f6 " macvim specific setting for editor's background color 
                  set guioptions-=r                  " remove right scrollbar
                  set laststatus=0                   " don't show status line
                  set noruler                        " don't show ruler
                  set fullscreen                     " go to fullscreen editing mode
                  set linebreak                      " break the lines on words
              endfunction
          • I also added this setting to .vimrc, to toggle SpellChecking in normal mode.
              :map <F5> :setlocal spell! spelllang=en_us<CR>
          • Installed the Cousine font. It's a free alternative to Nitti Light, the font used by iA Writer.

          • Turned off Mac OS X's native full-screen mode for MacVim (otherwise the custom background color is not applied).

              defaults write org.vim.MacVim MMNativeFullScreen 0

          You can find the customized versions of all needed files in this Git repo.

          Update: I extracted the distraction free mode settings into its own plugin and allowed it to be toggled from the F4 key (now it won't be forced upon opening Markdown files). Check the repo in GitHub for updated settings.

          ]]>
          Punch Reloaded http://laktek.com/2012/09/18/punch-reloaded http://laktek.com/2012/09/18/punch-reloaded/#comments Mon, 17 Sep 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/09/18/punch-reloaded Punch started out as a simple static site generator, which I wrote to use in my own work. After releasing it to the public, it attracted a bunch of passionate early adopters, who provided some valuable feedback on how to improve it. Their feedback gave me a better perspective on what people really expect from a modern-day web publishing tool.

          So last July, I took two weeks off from my day job to create the initial prototype for a new Punch. Then for the next two months, I spent most of my early mornings and late nights turning this concept into a reality.

          Today, I present you Punch Reloaded, which I deem as a modern web publishing framework.

          Everything is modular

          The problem with web development is that everything changes so fast. Devices change. Browsers change. Best practices change. The tools we use need to change as well.

          However, this rapid pace of change is not peculiar to web development. It's been a characteristic of the computing field from its inception. If there was a system which understood and embraced this concept of rapid change, it was Unix. Unix followed a modular design, which allowed it to evolve with time and needs.

          Punch was inspired a lot by this Unix philosophy. The framework is composed of small, self-contained modules, each written to do only one particular task. You can take everything apart and put it together the way you want. You can easily extend the default stack to support any template engine, parser or pre-compiler you want. You can define the way you want to handle content and layouts. You can even embed Punch within a larger application.

          To learn more about the possibilities, refer the sections under Customizing Punch in the Punch Guide.

          Power your site with any JSON backend

          By default, Punch fetches content from JSON files stored in the local file system. However, with the new modular design, you can use any backend that supports JSON to power your Punch sites. It could be a relational database such as Postgres, a document store such as Mongo, a third-party web service or even a new backend service like Firebase.

          Check out this example app, which uses the Twitter API as its backend. I plan to release plugins for several popular backends in the near future. If you are interested in writing your own content handler, please check this article.

          Flexible, inheritable layouts

          With most Content Management Systems you are stuck with a single layout for all pages in a site. But Punch makes it easy to use different layouts for different sections and pages of the site. Unlike in other frameworks, you don't need to explicitly define the layout you want to use. Punch will intuitively fetch the matching layout for a page, based on its name and hierarchy.

          Read this article, to learn more about the organization of layouts.

          Spice up the templates with helpers

          Punch offers built-in helpers for common text, list and date/time operations. They are defined in a way that can be used with any template language.

          What's more interesting is the option to define your own custom helpers. Custom helpers have access to the request context and can be used to enhance both the response headers (eg. setting cookies) and the body. You can learn more about writing custom helpers from this article.

          Minify and bundle assets

          In these days of the mobile web, performance is a key factor for any kind of web site. Minifying and bundling JS/CSS assets are regarded as common best practices to increase the performance of a site. Yet apart from web app frameworks like Rails, most other site management workflows don't adopt such practices.

          However, Punch offers a smart workflow to minify and bundle assets. So you can manage the front-end code, without giving up on the clarity or performance.

          CoffeeScript and LESS out of the box

          With Punch, you can write CoffeeScript and LESS as you would write regular JavaScript and CSS. Make the changes in your editor and just reload the browser to see them. Punch will do the compiling for you.
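
          For instance, a LESS snippet like this (the selector and variable are made up for illustration):

            // templates/css/site.less
            @accent: #c0392b;

            .post {
              h2 a { color: @accent; }
              blockquote { border-left: 3px solid @accent; }
            }

          is served to the browser as plain CSS (.post h2 a { color: #c0392b; } and so on), with no separate build step on your part.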

          Check this screencast, to understand how smooth the flow is:

          Easy to get started!

          Punch now comes with a default site structure and a hands-on tutorial to help you get started. Once you get acquainted, you can even use your own boilerplates to create a new site.

          I have also rewritten the Punch Guide, to help you to peruse all the new features.

          Make it your own...

          As I said before, you can extend Punch's default stack by writing plugins. It's possible to write wrappers to use any backend, template engine, parser, pre-compiler and minifier with Punch. Basically, you can use Punch as a glue code, to build your own framework.

          Reading the source is probably the best way to understand what happens under the hood. Also, you can refer the specs to get a better idea on the actual behavior.

          Go, Give it a try and create some kickass sites!

          ]]>
          Simple Helper to Extract Values from a String http://laktek.com/2012/10/04/extract-values-from-a-string http://laktek.com/2012/10/04/extract-values-from-a-string/#comments Wed, 03 Oct 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/10/04/extract-values-from-a-string While writing a blogging engine based on Punch, I needed to implement a way to extract values from a path based on a permalink pattern defined by the user.

          After writing a helper function to solve this case, I realized it could be useful in other similar cases too. So I extracted it into its own package, named ExtractValues.

          Here's how you can use it:

          var extractValues = require('extract-values');
          
          extractValues('/2012/08/12/test.html', '/{year}/{month}/{day}/{title}.html')
          >> { 'year': '2012', 'month': '08', 'day': '12', 'title': 'test' }
          
          extractValues('John Doe <john@example.com> (http://example.com)', '{name} <{email}> ({url})')
          >> {'name': 'John Doe', 'email': 'john@example.com', 'url': 'http://example.com' }
          
          extractValues('from 4th October  to 10th  October',
                          'from `from` to `to`',
                          { whitespace: 1, delimiters: ['`', '`'] })
          >> {'from': '4th October', 'to': '10th October' }
          
          extractValues('Convert 1500 Grams to Kilograms',
                          'convert {quantity} {from_unit} to {to_unit}',
                          { lowercase: true })
          >> {'quantity': '1500', 'from_unit': 'grams', 'to_unit': 'kilograms' }

          Options

          From the above examples, you may realize the helper function accepts several options. Let's see what those options mean.

          whitespace - normalizes the whitespace in the input string, so it can be aligned with the given pattern. The value defines how many consecutive whitespace characters to keep in the string. Setting it to zero (0) will remove all whitespace.

          lowercase - converts the input string to lowercase before matching.

          delimiters - You can specify which characters are the value delimiters in the pattern. Default delimiters are { and }.

          This is intended for matching trivial and definite patterns, especially in contexts where you want to give end-users the option to provide patterns. For more complex and fuzzy matching, you would be better off with regular expressions.
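
          For comparison, here is the first permalink example done with a plain regular expression. It works, but it is not something you would want your end-users to write:

          var match = '/2012/08/12/test.html'.match(/^\/(\d{4})\/(\d{2})\/(\d{2})\/(.+)\.html$/);
          // match => ['/2012/08/12/test.html', '2012', '08', '12', 'test']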

          You can check the source in GitHub or directly install it from the NPM (npm install extract-values).

          P.S.: There were some great suggestions and alternate language implementations in the HackerNews discussion for this post.

          This helper has since been ported to several other languages:

          ]]>
          My Story of 1996 World Cup http://laktek.com/2012/10/07/how-i-remember-1996-world-cup http://laktek.com/2012/10/07/how-i-remember-1996-world-cup/#comments Sat, 06 Oct 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/10/07/how-i-remember-1996-world-cup Every Sri Lankan you meet today will have only one topic to talk about: the T20 World Cup finals. You will hear them saying how Malinga can silence Chris Gayle, what makes Mahela the best captain in the game, why Sri Lanka couldn't win the cup on the last 3 occasions and, of course, why the 1996 World Cup victory was so special. No matter how hard you try, the 1996 World Cup victory will keep popping up in any cricket-related conversation with a Sri Lankan. Everyone has their own story to share about the 1996 victory.

          Here's mine.

          It was another eventful day in the new school year. We were sitting for the Grade 5 scholarship exam that year, so everyone seemed more enthusiastic about their studies than before. Just after the interval, some were rushing around the teacher's table to get their books corrected, some were fighting in the corner and some of us were just chatting. "BOOM!" Suddenly a noise like thunder was heard from some distance away. We even felt a tremor in our old classroom building. The crows flocking in the nearby banyan tree seemed frightened. There was a pin-drop silence in the class, which had been noisier than a Sunday market moments before.

          "That's a bomb." someone murmured. We knew it was close, but had no idea where. Teacher tried to calm down the class and continue with the lesson.

          About an hour later, someone came into our classroom to meet the teacher. It was my dad. His shirt looked brownish. His otherwise perfectly combed hair looked messy. I couldn't hear what he was talking about with the teacher. After the conversation, the teacher came to me and asked me to get ready to go home. I had no idea what was happening. Then a huge feeling of fear entered my mind. My legs started to shiver. "Where's mother?" I asked dad, hurriedly walking out of the class. "She's there," he said. For a moment, I thought he was lying; something terrible must have happened. Why was he taking me out of class early?

          Dad wasn't lying. Mom was there down the stairs with my younger brother. She had gone to pick him up from his class as well. Her face looked pale. There was no expression on it. I still had no idea what was happening. "Why does your shirt look dirty? Did you have an accident?" I asked dad again. "No, I'm alright."

          Only when we returned home and switched on the TV did I realize what had happened. The LTTE had carried out a brutal attack in Fort, the commercial center of Colombo. A lorry filled with explosives had been crashed into the Central Bank, leaving only debris behind. The Ceylinco Building, where my dad worked, was in flames after being hit with RPGs. Glass was shattered in the WTC and Bank of Ceylon buildings; the latter is where my mom worked. It was said the forces were able to hunt down the cadres before they launched attacks on those buildings.

          The attack killed 91 people and left thousands permanently injured. It was a miracle that both of my parents survived one of the most outrageous terrorist attacks in the world. But they seemed to be moving on with their daily chores. Maybe they were too shocked, or were just trying to hide the gravity of the incident from us. Later, dad told me those were blood stains on his shirt, from carrying injured people out of the building.

          The fear that entered my mind was still there. I couldn't sleep. I had nightmares. I couldn't erase those images of burnt bodies, people jumping out of flaming buildings and kids moaning over their parents' coffins. That could have been my parents. I feared there would be more attacks. I wanted my parents to stay at home without going to work.

          The only solace for me was going to school. There I was able to be with friends, engage in different activities and forget these fearful thoughts. However, one morning the government announced that schools would be closed indefinitely, considering the prevailing security situation in the country. It was rumored that the terrorist leader Prabhakaran had proclaimed he would make thousands of mothers weep in the South.

          Being able to stay at home without going to school would have made any 10-year-old happy. But it actually made me sad and crazy. Without anything to occupy my mind, I now had to be at home all day with the fear and suspense of another bomb and of losing my parents.

          Also, on the same day the schools were closed, the Cricket World Cup started. This time it was played in the sub-continent, jointly hosted by India, Pakistan and Sri Lanka. All the matches were shown on TV. Bored with nothing to do at home, I decided to watch every single match.

          As the tournament started, Australia announced they would not tour Sri Lanka, citing the security risks. Soon the West Indies followed the same path. It was a major blow for the tournament and, more importantly, a huge disappointment for the Sri Lankan fans, who had waited for this historic occasion. At this time, cricketers from India and Pakistan stepped up to form a Friendship XI and play a match in Colombo to support Sri Lanka. Nobody would have believed these arch-rivals could share one dressing room and hug each other after taking a wicket, until this match. I remember Wasim Akram saying he forgot his kit bag and was wearing one of Azharuddin's shirts for the match. I felt this was the real power and beauty of sport.

          None of the early setbacks had any effect on the Sri Lankan team's performances. Actually, it appeared those incidents had made them stronger. They won all of their group matches comfortably and qualified for the knockouts. In the match against Kenya, Sri Lanka smashed 398 runs in 50 overs, which was the world record at that time.

          Meanwhile, I was getting fully immersed in the excitement of the World Cup. I started writing down the scores of every single match in an exercise book, and even calculated the statistics for each cricketer. When a player's stats were shown on TV, I would compare them with my records to see if they tallied. I remember reverse engineering how the batting average and strike rate were calculated. It took me some time to realize that I had to subtract the not outs from the innings batted before taking the average.

          Instead of fearful nightmares of bombs, I was seeing dreams of Sanath playing his square cuts and Aravinda playing his pulls. My morning prayer changed from "God let there be no bombs today" to "God help us to win the match today".

          We reached the semi-finals of the World Cup. The match was played against India at Eden Gardens, Calcutta. The stadium looked packed. It felt like a battle between 11 Sri Lankans and 100,000 Indians. India won the toss and made Sri Lanka bat first. So far, Sri Lanka had won all their matches chasing; they hadn't batted first in any of the matches before. Then, in the first over itself, the Indians sent the Sri Lankan openers Sanath and Kalu back to the pavilion. It was the explosive starts the duo produced that had helped Sri Lanka win all their previous matches, so that was a mighty blow. However, then came Aravinda to play one of the best knocks I have seen to this date. He scored a brisk 66 runs, and his stroke play was scintillating. The rest of the batsmen built on the foundation laid by Aravinda and took Sri Lanka to a respectable 250 runs.

          The Indians were going comfortably in their chase. But everything changed when Kalu created a stumping opportunity out of nowhere off Sanath's bowling, which sent Sachin Tendulkar back to the pavilion. It was like payback for what they had missed with the bat. The fall of Sachin made India crumble. The scoreboard, which read 100 for 1 moments ago, now read 120 for 8. There was no chance for India to win the match from that point.

          Suddenly, the game was halted. It seemed some of the Sri Lankan players were hit with water bottles from the stands. Then we could see people trying to set fire to the stands. The fear started to creep into my mind again. Will they hurt our players?

          As the riots in the stadium grew stronger, the officials decided to end the match and award the victory to Sri Lanka. TV cameras focused on a person holding a banner in the middle of the flames, which read "Congratulations Sri Lanka! We are sorry". It was apparent the anger was not against the Sri Lankans, but against their own team. Azhar, who had been worshiped as an idol a few weeks before, had a sad and disgraceful ending to his career.

          There was no better ending to Sri Lanka's fairy-tale journey in the World Cup than meeting Australia in the finals. It was they who had called Murali a chucker during the Boxing Day test match. Sri Lankans believed the Aussies knew Murali was going to be a threat to world cricket in the years to come and wanted to end his career early. Also, Murali was the only Tamil cricketer in the Sri Lankan team and everyone treated him as a source of national pride. The distaste for the Aussies grew further when they forfeited their group matches in Sri Lanka.

          So this looked like the perfect occasion to take revenge on the Aussies. Unlike the Aussies, the Sri Lankans didn't resort to any dirty tricks. Instead, they replied by playing the gentleman's game and totally outclassing the Aussies in all departments of play. The whole city turned into one big party when the Sri Lankan skipper Arjuna scored the winning runs with a boundary.

          We lit firecrackers. I ran around the house.

          A few days after the World Cup, schools started again. We carried bats and balls to school and were allowed to play cricket during the P.E. period instead of those boring exercises. I tried to walk down the wicket and hit a six over long-on like Sanath, only to get bowled by my friend Prabhath, who sent down off-spinners like Murali.

          Every one of us had a new dream!

          ]]>
          Embrace the Static Web with Punch http://laktek.com/2012/11/04/embrace-the-static-web-with-punch http://laktek.com/2012/11/04/embrace-the-static-web-with-punch/#comments Sat, 03 Nov 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/11/04/embrace-the-static-web-with-punch This was originally published in the November edition of Appliness digital magazine.

          Remember the very first web sites we created? They were just a bunch of HTML files that we uploaded to a remote host using our favorite FTP program. It was all static and it just worked. Yet it was boring. Visitors wanted sites that surprised them every time they hit refresh. Moreover, we also got tired of the slow and cumbersome process we had to follow every time we updated the site. Editing content within an HTML tag soup was chaos. We broke sites because we forgot to close some nested tag.

          Along with the popularity of the LAMP (Linux, Apache, MySQL and PHP) stack came the Content Management Systems. They seemed to be the right way to manage a web site and it didn't take long for them to become the de facto standard. CMSs allowed us to separate the content from the presentation and update our sites with just a couple of mouse clicks. Anyone could run a site, even without knowing HTML.

          However, as our sites grow and start attracting more traffic, we see the shortcomings of CMSs. They become slow because they render the pages for each request. You need to tune the database and servers to handle the load. To add a trivial new feature to the site you need to modify the internals of the CMS (which is often spaghetti code). Further, they are full of vulnerabilities. Remember the day your site got hacked because you missed one of those daily security updates? Managing a web site starts to take over your life and becomes a chore.

          At times like this, we start to feel nostalgic about the static web sites we created. "It was just a bunch of HTML files. But it worked!"

          This inspired me to write Punch, which brings back the simplicity of static web, along with the conveniences of content management. There's no server-side code to run, no databases to configure, no mix up between HTML and the content. You can resort to your favorite tools to create the site locally, preview it in the browser and finally publish the changes to any remote host using a simple command-line interface.

          It's easier to understand the concepts of Punch with a simple real-life example. Let's create a site using Punch to share your favorite books with others. We shall call this project the "Reading List".

          If you are in a hurry, you can check the final result from here and download the source code from GitHub.

          Installing Punch

          Before we start the tutorial, let's install Punch. To run Punch you will need Node.js. Make sure you have installed Node.js (version 0.8+) on your machine. Then open up your Terminal and type:

              npm install -g punch

          This will install Punch as a global package, allowing you to use it as a shell command. Enter the command punch -v to check whether Punch was installed properly. This tutorial was created using Punch version 0.4.17.

          Setup a New Site

          Let's spin off a new project for our Reading List. By running the command punch setup, you can create the project structure with essential files.

              punch setup reading_list

          This will create a directory named reading_list. Inside it we will see two more directories, named templates and contents. Also, you will find a file named config.json. You will learn the purpose and role of these directories and files as the tutorial progresses.

          While we are inside the project directory, let's start the Punch server by running the command:

              punch s

          This will allow us to preview the site we create in real-time. By default, the server starts on the port 9009.

          Open your browser and enter the URL http://localhost:9009. You should see the welcome screen along with a link to a quick hands-on tutorial. I highly recommend you take a couple of minutes to go through this quick tutorial first, as it will help you grasp the core concepts of Punch. I'll wait till you finish it.

          Intro Screen

          Preparing the Layout

          In the quick hands-on tutorial, you learnt that Punch uses Mustache as the default templating language. You also learnt that the layouts, partials and static assets that compose a site's user interface must be saved inside the templates directory.

          Make sure you remove the {{{first_run}}} tag from templates/_footer.mustache to get a clean view without the hands-on tutorial.

          Now let's turn our focus back to the Reading List page we are creating. It should contain the following information:

          • Introduction
          • List of Books (we must provide the following information for each book)
            • Title
            • Cover image
            • Author
            • ISBN
            • Your rating
            • Favorite quote from the book
            • Link to the book in Amazon

          We only need to create a single web page to show this information. So we can directly customize the default layout (templates/_layout.mustache) to create the view we need.

          {{> header }}
          
                  <div role="main">
                      <p>{{{intro}}}</p>
                      <div id="books">
                          {{#book_list}}
                              <div class="book">
                                  <h3><a href="{{amazon_link}}">{{{title}}}</a></h3>
                                  <div class="cover">
                                      <a href="{{amazon_link}}"><img src="{{cover_image}}"></a>
                                  </div>
                                  <ul>
                                      <li><b>Author</b> - {{author}}</li>
                                      <li><b>ISBN</b> - {{isbn}}</li>
                                      <li><b>Rating</b> - {{rating}}</li>
                                      <li><b>Favorite Quote</b> - {{{favorite_quote}}}</li>
                                  </ul>
                              </div>
                          {{/book_list}}
                      </div>    
                  </div>
          
          {{> footer }}

          Note that some Mustache tags are surrounded with three curly braces, while others are surrounded with two. With three curly braces, we tell Mustache not to escape the HTML within the content. In places where you want to have HTML-formatted content, you must use the tags with three curly braces.
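
          As a quick illustration, here is the difference when rendering with mustache.js directly (the variable name is made up):

          var Mustache = require('mustache');

          // {{tag}} escapes HTML entities, {{{tag}}} leaves them untouched
          Mustache.render('{{company}}', { company: 'AT&T' });   // => 'AT&amp;T'
          Mustache.render('{{{company}}}', { company: 'AT&T' }); // => 'AT&T'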

          After modifying the layout, refresh the page in the browser to see the changes.

          Creating the Reading List in JSON

          You still won't see any visual difference on the page. But if you view the page source, you will see the HTML structure you defined, with empty values in the places where the Mustache tags were. We must provide content to render into those tags.

          Let's start with the most essential piece of content of the page - list of books. Open the contents/index.json and start entering the details of your favorite books in the following format.

          {
              "book_list": [
                  {
                      "title": "The E-Myth Revisited",
                      "amazon_link": "http://www.amazon.com/gp/product/0887307280",
                      "cover_image": "http://ecx.images-amazon.com/images/I/41ieA7d6CYL._SL160_.jpg",
                      "author": "Michael E. Gerber",
                      "isbn": "0887307280",
                      "rating": "10/10",
                      "favorite_quote": "\"The true product of a business is the business itself\""
                  }
              ]
          }

          We've defined a JSON array named book_list which contains multiple book objects. For each book object, we define the required details as properties.

          Save the file after entering the books and refresh the browser. You should now see the book list you just created rendered into the page as HTML.

          You can continue to add more books or update the existing entries in the contents/index.json. The page will be rendered every time you make a change in the content.

          Writing the Introduction Text using Markdown

          Now that we have listed our favorite books, let's add a simple introduction to the page. Rather than defining it as a JSON string, you can use Markdown formatting to write this piece.

          When fetching contents for a page, Punch will look for extended contents, such as Markdown-formatted text, in a directory named after the page and prefixed with an underscore. This directory must be placed inside the contents directory, along with the JSON file for the page.

          In this instance, we should create a directory named _index and save our introduction inside it as intro.markdown. The filename of an extended content file should be the tag name you wish to use in the templates to retrieve that content.
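
          So the contents directory for this page would look roughly like this:

          contents/
              index.json          (the book list defined above)
              shared.json         (site-wide content, covered below)
              _index/
                  intro.markdown  (rendered into the {{{intro}}} tag)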

          Changing the Site's Title

          You will notice the site's title is still displayed as "Welcome". Let's change that too. Site-wide content, such as the site's title and navigation items, is defined in the contents/shared.json file. Open it and change the site's title to "Reading List".

          Styling with LESS

          Now that we are done preparing the content, let's make some style changes to beautify the page. You can use LESS to write the styles and Punch will automatically convert them into regular CSS.

          As I mentioned previously, all static assets such as stylesheets, JavaScript files and images must be stored in the templates directory. You can organize them in any way you like inside the templates directory.

          You will find the default site specific styles in templates/css/site.less. Let's change them to achieve the design we want for Reading List. To keep this tutorial concise, I won't show the modified stylesheet here. You can check it from the project's repo on GitHub:

          Similar to processing the LESS files, Punch can also pre-compile CoffeeScript files into JavaScript automatically.

          Minifying and Bundling CSS/JavaScript Assets

          Minifying and bundling CSS & JavaScript assets are recommended performance optimizations for all kinds of web sites. They help to reduce the number of round-trips browsers need to make in order to fetch the required assets, and also minimize the size of the assets that need to be downloaded.

          Minifying and bundling assets in a Punch based project is fairly straightforward. You only have to define your bundles inside the config.json. Then Punch will prepare and serve the minified bundles at the time of generating the site.

          We can bundle the CSS files used in the project like this:

          "bundles": {
              "/css/all.css": [
                  "/css/normalize.css",    
                  "/css/main.css",
                  "/css/site.less"
              ]    
          }

          Then, you can use Punch's bundle helper tags to call defined bundles inside the templates.

          <head>
              <!-- snip -->
              {{#stylesheet_bundle}}/css/all.css{{/stylesheet_bundle}} 
          </head>

          This will produce a fingerprinted stylesheet tag (useful for caching) like this:

          <head>
                  <!-- snip -->
                  <link rel="stylesheet" type="text/css" media="screen" href="/css/all-1351313179000.css"> 
          </head>

          Similar to the stylesheet_bundle tag, there's a javascript_bundle tag which you can use to include JavaScript bundles in a page.
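
          For example, you could define a hypothetical JavaScript bundle in config.json alongside the CSS bundle (the file names here are just placeholders):

          "bundles": {
              "/css/all.css": [ "/css/normalize.css", "/css/main.css", "/css/site.less" ],
              "/js/all.js": [ "/js/plugins.js", "/js/main.js" ]
          }

          and include it in a page with {{#javascript_bundle}}/js/all.js{{/javascript_bundle}}.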

          Publishing to S3

          Our Reading List page is now almost complete.

          Finished Site

          Let's share it with others by publishing on the web. Punch allows you to either publish your site directly to Amazon S3 or upload it to a preferred host using SFTP.

          In this tutorial, we will publish the site to Amazon S3. You will have to sign up with Amazon Web Services and enable the S3 storage service for your account. Then, create an S3 bucket and a user who has access to that bucket.

          Enter those settings in the config.json file of your Punch project under the section name publish.

              "publish" : {
                  "strategy" : "s3", 
                  "options" : {
                      "bucket" : "BUCKET",
                      "key" : "KEY",
                      "secret" : "SECRET",
                      "x-amz-acl": "public-read"
                  }
              }

          Then on the terminal, run the following command:

              punch p

          This will publish the Reading List site you just created to S3. Point your browser to the URL of your S3 bucket and you should see the site.

          You can check the sample site I created by visiting this URL: http://readlist.s3-website-us-east-1.amazonaws.com

In the future, if you want to add a new book or update an existing entry, all you need to do is edit the list in contents/index.json and then run punch p again. It will publish only the files modified since the last update.

          Extending Punch

In this tutorial, I covered the basic workflow for creating and publishing a site with Punch. You can easily extend and customize it further based on your requirements.

For example, you can implement your own content handler to enable sorting of the book list by different criteria. Also, if you prefer a different templating language from Mustache, you can write your own template engine wrapper for Punch (check out the implementation of the Handlebars engine).

If you're interested in learning more about the available features in Punch and how to extend them, you can refer to the Punch Guide.

Punch based Boilerplate to Power Your Blog http://laktek.com/2012/11/26/a-fast-intuitive-blogging-tool-based-on-punch http://laktek.com/2012/11/26/a-fast-intuitive-blogging-tool-based-on-punch/#comments Sun, 25 Nov 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/11/26/a-fast-intuitive-blogging-tool-based-on-punch When I first created Punch, I didn't intend it to be used as a blogging tool. This blog was running on Jekyll at the time and I was content with the setup. However, as I kept switching back and forth between Punch-based projects and this Jekyll-based blog, it became apparent that Jekyll's workflow lacked the simplicity and intuitiveness I had with Punch. So I wanted a Punch-based extension for blogging.

I wrote a special blog-specific content handler, which can be added to any Punch-based site. It also preserves a lot of Jekyll's conventions, such as the post structure and permalinks. This makes the transition of a blog from Jekyll to Punch really smooth.
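For instance, a post is just a Markdown file named Jekyll-style, with the date followed by a slug; something along the lines of:

    2012-11-26-a-fast-intuitive-blogging-tool.markdown

(The exact directory layout depends on how the handler is configured.)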

Last week, I switched this blog from Jekyll to a Punch-based setup. I'm happy with how it turned out. The workflow of creating, previewing and publishing a post is now much faster and more intuitive. You can check the source code of the project to get an idea of how it is structured.

Since a lot of you were asking about the possibility of using Punch to power a blog, I thought it would be useful to share my workflow with you. So I decided to release a boilerplate based on my setup, which you can use to create your own blog.

Screenshot of Punch Blog Boilerplate

          Here are some of the cool features available in this boilerplate:

          • Preview posts, as you write them.
          • Easily publish to Amazon S3.
          • Pretty URLs for permalinks (no .html, configurable).
          • Responsive, customizable theme based on HTML5Boilerplate & 320andup framework.
          • Load fonts from multiple sources with WebFonts Loader.
          • Easily configure Google Analytics, Tweet button & Disqus comments.
          • Highlighting the current page link.
          • Post archives by tags.
          • Post archives by year, month or date.
          • Write posts using GitHub flavored Markdown.
• Client-side code highlighting with Prism.js.
• Published/draft states.
• Automatically minifies and bundles JavaScript/CSS.
• RSS feed.
• Sitemap.xml.

          Also, you can use any other features available in Punch.

          • Manage other pages with Punch's default content handler.
          • Extend the behavior by writing your own helpers.

          You can download the boilerplate from GitHub. After downloading it, follow the setup instructions specified in the README. It's very easy to customize and extend.

Feel free to open a ticket in GitHub issues if you run into any bugs.

          JSCamp Asia http://laktek.com/2012/12/04/jscamp-asia http://laktek.com/2012/12/04/jscamp-asia/#comments Mon, 03 Dec 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/12/04/jscamp-asia Singapore city view at night

Last week I was in Singapore to attend JSCamp.asia. It was the first full-scale JavaScript conference to be staged in this region, and it had nothing less than the vibrant, insane spirit you expect from a JavaScript conference.

JSCamp happened to be the first international conference I was selected to give a talk at. My talk was titled "Embracing the Static Web". It touched on the challenges we face as web developers in this region and the need to create tools to overcome these challenges while building modern web experiences. In the latter part of the talk, I explained how these needs motivated me to create Punch and shared how it fits the bill as a modern web framework. I received a lot of encouraging feedback and remarks after the talk. But I believe I could have done better, and this experience will help immensely when I prepare for future talks.

          Here's the slide deck for the talk.

          Update: Video of the talk is now online.

JSCamp gave me the opportunity to meet and hang out with many interesting and respected personalities in the JS (and web tech) community. Seriously, they are so smart and passionate about their game. Also, I couldn't ignore how genuine and honest they were when it came to sharing their knowledge and experiences. Jed, Divya, Angus, Jan, Michal, Tim, Alex, Tomasz, Eric and John/Zackery of GitHub, you guys are awesome!

Also, I got to meet a lot of developers based in Singapore, and others who came specifically for JSCamp from countries like Thailand, Indonesia, Vietnam, South Korea and the Philippines. They seem to do a lot of interesting stuff, and I gathered a lot of insights by talking to them. Most of them have followed traditional educational paths (which is common in Asia) and are now eagerly switching gears to expand their knowledge into web technologies.

Overall, I feel there will be a huge bloom in the web tech industry and related startups in this region in the coming years. Singapore, especially, seems to have the right kind of essence to ignite such a trend. However, it's still an infant compared to the ecosystem in Silicon Valley. But with more experience and inspiration, I'm sure things will start to change rapidly.

That's why we need more events like JSCamp in this region. I expect to see others taking up this challenge as Thomas and Sayanee did. Great job, guys!

          End of a Chapter http://laktek.com/2012/12/24/end-of-a-chapter http://laktek.com/2012/12/24/end-of-a-chapter/#comments Sun, 23 Dec 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/12/24/end-of-a-chapter Thank you Vesess!

Yesterday was officially my final day with the Vesess team.

It was 2006. I was just out of college, spending most of the day lurking on local internet user groups. If my memory serves right, it was on a random discussion thread that I first stumbled upon Prabhath Sirisena and his blog, Nidahas (unfortunately dead now). Nidahas aroused my curiosity in topics such as Web Standards, Open Source Software and Startup Culture. It also inspired me to create my own blog.

A couple of months later, I was selected to the IT faculty of the University of Moratuwa. Prabhath was also a final-year student at the same faculty. Though I always wanted to meet him in person, it didn't happen until this sensational IM conversation:

          Prabhath: hi lakshan, free for a chat?

          me: ya sure

          Prabhath: ok, let me be frank and get on with it you know about Vesess, i guess?

          me: whats up? ya your company ni

          Prabhath: right, so we're on the lookout for young prospects. Could you tell me your current "availability" so to speak, so i can do my dirty pitch?

          me: i wud luv to join with vesses

That decision was a no-brainer. Vesess had already built up a reputation for its quality work, and they seemed to be aiming at a different level compared to other Sri Lankan web design agencies. I knew this would be the perfect place for a freshman like me to groom my career.

A few days later, I went to a nice little office in Horton Place. I was welcomed by two lanky, pleasant-looking guys who seemed to be in their early twenties. They happened to be the two co-founders of Vesess. That was my first interaction with Lankitha and Prabhath in meatspace.

Part of the orientation at Vesess involves answering a questionnaire called "Think First". It's a guide which both you and the company can use to assess your progress in the long run. One of the questions in it was:

          How would you like to describe yourself in 5 years from now? Can you explain this with targets?

          This was my answer to it:

          I would be a team leader of a small start-up that will make huge impact on the society, by developing products that would simplify the lives of many.

          Back then, I had no idea how realistic that dream was. But today, 6 years down the line I can proudly say I have achieved that exact goal with CurdBee.

At the beginning of 2008, Lankitha wanted to diversify the company from being solely a web design agency into a product company. CurdBee was our first attempt at reaching this goal. I was released from client projects to focus fully on the development of CurdBee.

There were a lot of unknowns when we started. How to monetize the product? How to get a payment gateway and a merchant account from Sri Lanka? How to market the product? How to get popular tech blogs to cover us? How to provide support? How to scale the application? How to make it secure? How to retain the users? How to find talent?

During the last 4 years, we worked hard to find answers to these questions. The current state of CurdBee proves we have answered them in the best way we could. It wasn't an overnight success. We had many arguments, sleepless nights, moments of panic, tears and a bunch of small victories along the way.

          Once you’re successfully past 1.0, you have a choice: coast and die, or disrupt. No one in history has ever actually chosen coast and die; everyone thinks they’re choosing the path of continued disruption, but it’s a very different choice when it’s made by a Stable than by a Volatile. A Stable’s choice of disruption is within the context of the last war. They can certainly innovate, but they will attempt to do so within the box they bled to build. A second-generation Volatile will grin mischievously and remind you, “There is no box.”

This beautiful essay by Rands is a perfect summary to explain my decision to move away from Vesess. I still have that desire to disrupt, but I believe the time has come for me to pick a new battle. I love getting into adventures full of challenges and uncertainties. Those are the environments that bring out the best in me. I prefer being a volatile to a stable. You will soon get to hear more about my next moves.

The army I created at Vesess is now ready for the limelight. There's a perfect blend of volatiles and stables who can continue to disrupt. I'm sure they will take CurdBee to the next level and innovate better than during my reign.

I'm forever grateful to Lankitha for believing in me and helping me reach my dreams. He always preferred to remain behind the scenes, setting the stage for us to shine. You are a great leader, a friend and a brother. Prabhath, Mahangu, Laknath, Sameera, Amila, Lahiru, Asantha, Asanka, Anushke, Ramindu and Mahinda: it was a pleasure working with each one of you and I enjoyed every moment of it.

          Thank you Vesess for everything!

Revisiting JavaScript Objects http://laktek.com/2012/12/29/revisiting-javascript-objects http://laktek.com/2012/12/29/revisiting-javascript-objects/#comments Fri, 28 Dec 2012 16:00:00 GMT Lakshan Perera http://laktek.com/2012/12/29/revisiting-javascript-objects During the holidays, I spent some time catching up on the developments in ES6 (the next version of JavaScript). While going through some of the proposals, such as Minimal Class Definitions, the Proxy API and Weak Maps, I noticed that most of these enhancements make extensive use of the object manipulation features introduced in ES5 (i.e. ECMAScript 5, the current JavaScript standard).

One of the main focuses of ES5 has been to improve JavaScript's object structure and manipulation. The features it introduced make a lot of sense, especially if you're working on large and complex applications.

We've been a little reluctant to adopt ES5 features, especially due to browser compatibility issues. We rarely see production code that makes use of these features. However, all modern browsers (i.e. IE9, FF4, Opera 12 & Chrome) have JavaScript engines that implement the ES5 standard. Also, ES5 features can be used in Node.js-based projects without any issue. So I think it's a worthwhile exercise to revisit the ES5 object features and see how they can be used in real-life scenarios.

          Data and Accessor Properties

ES5 introduces two kinds of object properties: data and accessor properties. A data property directly maps a name to a value (e.g. an integer, string, boolean, array, object or function). An accessor property maps a name to a getter and/or setter function.

          var square = {
              length: 10,
              get area() { return this.length * this.length },
              set area(val) { this.length = Math.sqrt(val) }
          }

          Here we have defined a square object, with length as a data property and area as an accessor property.

          > square.length
            10
          > square.area
            100
          > square.area = 400
            400 
          > square.length
            20

          When we access the area property, its getter will calculate and return the value in terms of the length property. Also, when we assign a value to area, its setter function will change the length property.

          Property Descriptor

          ES5 allows you to have more fine-grained control over the properties defined in an object. There's a special attribute collection associated with each property, known as the property descriptor.

You can check the attributes associated with a property by calling the Object.getOwnPropertyDescriptor method.

          > Object.getOwnPropertyDescriptor(square, "length")
          {
              configurable: true
              enumerable: true
              value: 20
              writable: true
          }
          
          > Object.getOwnPropertyDescriptor(square, "area")
          {
              configurable: true
              enumerable: true
              get: function area() { return this.length * this.length }
              set: function area(val) { this.length = Math.sqrt(val) }
          }

As you can see from the above two examples, the value and writable attributes are only defined for data property descriptors, while get and/or set are defined for accessor property descriptors. Both the configurable and enumerable attributes apply to any kind of property descriptor.

The writable attribute specifies whether a value can be assigned to a property. If writable is false, the property becomes read-only. As the name implies, configurable specifies whether the property's attributes can be changed and also whether the property can be deleted from the object (using the delete operator). The enumerable attribute determines whether the property should be visible in for...in loops or in the output of Object.keys.

          We can modify these attributes in the property descriptor by using the Object.defineProperty method.

          Object.defineProperty(square, "length", {
              value: 10,
              writable: false
          });

          This will make the length property in square read-only and permanently set to 10.

          > square.length
            10
          > square.area = 400
            400
          > square.length
            10
          > square.area
            100
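Coming back to the enumerable attribute, here's what turning it off looks like (book is just a made-up object for this example):

    var book = { title: "Siddhartha", isbn: "978-0553208849" };

    Object.defineProperty(book, "isbn", { enumerable: false });

    > for (var key in book) { console.log(key); }
      title
    > book.isbn
      "978-0553208849"

The isbn property is still readable and writable; it just no longer shows up when the object's properties are enumerated.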

          Tamper-proof Objects

In some instances, you need to preserve an object in its current state at run-time, without any further extensions or modifications to its properties. ES5 provides three levels of control that you can apply to objects.

Calling the preventExtensions method will make the object non-extensible. This means no further properties can be defined on the object.

          > Object.preventExtensions(square);
          
          > Object.defineProperty(square, "text", { value: "hello" });
            TypeError: Cannot define property:text, object is not extensible.

Sealing the object will prevent both the definition of new properties and the deletion of existing properties.

          > Object.seal(square);
          
          > delete square.length
            false

If we go one step further and freeze the object, it will also disallow changing the existing property values. At this point, the whole object effectively becomes a constant.

          > Object.freeze(square);
          
          > square.length = 20
            20 
          > square.length
            10

          You can use the methods Object.isSealed, Object.isFrozen and Object.isExtensible to programmatically check the state of an object.
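For instance, after the calls above, our square object reports:

    > Object.isExtensible(square)
      false
    > Object.isSealed(square)
      true
    > Object.isFrozen(square)
      true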

Even when an object is protected, it is still possible to extend its prototype. Check the following example:

          var obj = Object.create({}, { onlyProp: { value: true } });
          Object.preventExtensions(obj);
          
          var proto = Object.getPrototypeOf(obj);
          proto.anotherProp = true;
          
          > obj.anotherProp
            true
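If you need to guard against this, you'd have to protect the prototype as well, for example by freezing it:

    Object.freeze(proto);

    proto.yetAnotherProp = true;

    > obj.yetAnotherProp
      undefined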

          Enumerations

Often, we use JavaScript objects as associative arrays or collections. In such instances, we are tempted to use for...in loops to enumerate the properties. However, the loop will step through all enumerable properties available in the object's prototype chain, resulting in undesired outcomes.

To avoid such side effects, JSLint suggests manually checking whether the given property is defined on the object itself.

          for (name in object) { if (object.hasOwnProperty(name)) { .... } }

ES5 provides the Object.keys method, which returns an array of an object's own enumerable properties.

          We can use this method to safely iterate over a property list:

          Object.keys(obj).forEach( function(key) {
              console.log(key);
          });

Note: the forEach method on arrays is also a new feature introduced in ES5.

          Inheritance

We know JavaScript provides behavior reuse through prototypal inheritance. However, the lack of a direct mechanism to create a new object using another object as its prototype has been one of the pet peeves of the language.

          The standard way to create a new object is to use a constructor function. This way, the newly created object will inherit the prototype of the constructor function.

          var Person = function(first_name, last_name) {
              this.first_name = first_name;
              this.last_name = last_name;
          }
          
          Person.prototype = {
              say: function(msg) {
                  return this.first_name + " says " + msg;
              }
          }
          
          var ron = new Person("Ron", "Swanson");

If someone calls the constructor function without the new operator, it could lead to unwarranted side effects during execution. Also, there's no semantic relationship between the constructor function and its prototype, which can cause confusion when trying to comprehend the code.
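For example, calling the constructor without new in non-strict code binds this to the global object, so the properties quietly leak there:

    var ron = Person("Ron", "Swanson"); // note: no `new`

    > ron
      undefined
    > first_name // ended up on the global object (window in browsers)
      "Ron"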

For those who prefer an alternative syntax, ES5 provides the Object.create method. It takes a prototype object and a map of property descriptors as arguments.

Here's an alternative implementation that can be used to create Person objects, using Object.create and the module pattern.

          var Person = (function() {
          
              var proto = {
                  say: function(msg) {
                      return this.first_name + " says " + msg;
                  }
              }
          
              return {
                  init: function(first_name, last_name) {
                      return Object.create(proto, {
                          first_name: { value: first_name, enumerable: true },
                          last_name: { value: last_name, enumerable: true }
                      });
                  }
              }
          
          })();
          
          var ron = Person.init("Ron", "Swanson");

However, compared to constructor functions, using Object.create can be considerably slower. So choose which implementation to use depending on the context and requirements.

Even if you prefer to use constructor functions, Object.create comes in handy when you want multiple levels of inheritance.

          var Person = function(first_name, last_name) {
              this.first_name = first_name;
              this.last_name = last_name;
          };
          
          Person.prototype = {
              say: function(msg) {
                  return this.first_name + " says " + msg;
              }
          };
          
          var Employee = function(first_name, last_name) {
              Person.call(this, first_name, last_name);
          }
          
          Employee.prototype = Object.create(Person.prototype, {
              department: { value: "", enumerable: true },
              designation:{ value: "", enumerable: true }
          });
          
          var ron = new Employee("Ron", "Swanson");

          We've extended the Person prototype to create the prototype of Employee.
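To confirm the chain is wired up correctly:

    > ron.say("hello")
      "Ron says hello"
    > ron instanceof Employee
      true
    > ron instanceof Person
      true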

          Cloning Objects

          Finally, let's see how to create a shallow clone of an object using ES5's object methods.

          var clone = function(obj) {
              // create clone using given object's prototype
              var cloned_obj = Object.create(Object.getPrototypeOf(obj)); 
          
              // copy all properties
              var props = Object.getOwnPropertyNames(obj);
              props.forEach(function(prop) {
                  var propDescriptor = Object.getOwnPropertyDescriptor(obj, prop);            
                  Object.defineProperty(cloned_obj, prop, propDescriptor);
              });
          
              return cloned_obj;
          }

Here, we retrieve the prototype of the given object and use it to create the clone. Then we traverse all properties defined on the object (including the non-enumerable ones) and copy their property descriptors to the clone.
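Here's how it behaves with a simple object (original and copy are just illustrative names):

    var original = { greeting: "hello" };
    var copy = clone(original);

    > copy.greeting
      "hello"
    > copy === original
      false
    > Object.getPrototypeOf(copy) === Object.getPrototypeOf(original)
      true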

          Further Reading

If you're interested in learning more about JavaScript objects and how to manipulate them, I would recommend perusing the following resources:

          The Next Generation is Here! http://laktek.com/2013/01/06/the-next-generation-is-here http://laktek.com/2013/01/06/the-next-generation-is-here/#comments Sat, 05 Jan 2013 16:00:00 GMT Lakshan Perera http://laktek.com/2013/01/06/the-next-generation-is-here On January 1st, from the usual flow of new year wishes in my inbox, a special message stood out:

          Hi Aya did you use Html for coding websites ? i'm also coding websites but with Php

          Where did you host ur website ? did u host on ur own sever?

It was from one of my cousins. I hadn't seen him in 8 years. He was just a toddler when he migrated to France with his parents in 2004, just after the Tsunami. He's becoming a teenager this year. And he wants to make websites.

I created my first website when I was 14. Back in the day, creating and publishing a personal homepage on Geocities was considered the coolest thing you could do on the internet as a kid. My uncle (his dad) boasts that he will soon catch up with me. But he lives in an era where you can get 15,000 random people to like you just by saying "hey". He can take a picture of his food, apply a Polaroid filter and publish it on the web in a matter of seconds.

Then why the hell does he want to create his own website?

For long, we debated what the Facebook killer could be. The answer is here. No, it will not be another Silicon Valley startup. It is the post-millennials, the generation my cousin belongs to. They will soon revolt against the walled-garden social web. They will want to take the web back to its roots.

What can we do, as the veterans from the previous generation, to fuel this revolution? Should they be using a double-clawed hammer like PHP? Should they be forced to stay up all night figuring out how to configure Apache and MySQL for their environment? Should they still be reading tutorials that include hacks to circumvent the limitations of IE6 (or even know what IE6 is)?

They grew up experiencing vastly better systems. We've spoiled them with tools that are intuitive and fun to use. The web they know is so easy to consume. Can we also make it a fun platform to create on? A platform that will help them innovate better and move faster?

          They deserve tools that can widen their creativity, rich resources that can help them to learn better and of course, guidance on h