Nov 18 / Nizam Sayeed

Returning HTTP responses with django-tastypie

Here at MutualMind, we’ve built our REST developer API using the excellent django-tastypie framework. Once you understand the basic request/response cycle as described in the documentation, it takes hardly any time to get a full-featured REST API up and running. However, the documentation is missing one piece of information: What is the proper way to return various HTTP responses?

Not to worry. After taking a deep dive into the framework’s code, I’ve discovered this well kept secret. There are two pieces to this puzzle. One is a class found in tastypie.exceptions called ImmediateHttpResponse. And the other is a set of HTTP response classes found in tastypie.http. These classes include (but are not limited to):

  • HttpCreated: HTTP 201
  • HttpBadRequest: HTTP 400
  • HttpForbidden: HTTP 403
  • HttpNotFound: HTTP 404
  • HttpApplicationError: HTTP 500

So let’s see an example. Let’s say that you need to stop request processing and send a response back to the client saying that the resource it is trying to access is forbidden. First you would import the appropriate classes:

from tastypie.exceptions import ImmediateHttpResponse
from tastypie.http import HttpForbidden

In your API code, you could then raise an ImmediateHttpResponse and pass in the HttpForbidden object to the constructor:

raise ImmediateHttpResponse(
  HttpForbidden("Coleman?! There is no Coleman here.")
)
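
For context, you would typically raise this from inside one of your Resource methods. Here is a minimal sketch, assuming a hypothetical NoteResource, Note model and permission check (none of these are part of the original example):

from myapp.models import Note  # hypothetical model
from tastypie.exceptions import ImmediateHttpResponse
from tastypie.http import HttpForbidden
from tastypie.resources import ModelResource


class NoteResource(ModelResource):
    class Meta:
        queryset = Note.objects.all()
        resource_name = 'note'

    def obj_create(self, bundle, request=None, **kwargs):
        # stop processing and send back a 403 if the user may not create notes
        if not request.user.has_perm('myapp.add_note'):
            raise ImmediateHttpResponse(
                HttpForbidden("You are not allowed to create notes.")
            )
        return super(NoteResource, self).obj_create(bundle, request=request, **kwargs)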

You can substitute the response class with any of the others found in tastypie.http. Have fun baking tasty APIs with django-tastypie.

UPDATE (Nov 25, 2011 @ 4:33AM) Daniel Lindsley, the author of django-tastypie, pointed me to an alternative way to return an HTTP response. Simply invoke the create_response method of your Resource or ModelResource class. Just pass the HTTP response class to it using the response_class keyword argument like so:

return self.create_response(
    request, bundle,
    response_class = HttpForbidden
)

Sep 24 / Nizam Sayeed

Crouching Dallas, Hidden Startups

Bradley Joyce over at 3#Labs wrote a nice piece about a problem facing the startup scene in the Dallas area: We have a lot of tech talent in Dallas, just not startup talent.

I’d like to point out another problem in our area. We have a bunch of startups that operate under the radar and don’t even engage with the local startup scene in any way!

I know because I used to work for just such a startup. They are a well-funded startup (still alive and kicking) in the financial services business. They are funded by the likes of Idealab, Kleiner Perkins, Flybridge and many others. I think they had raised a Series E round the last time I heard.

I worked for these guys from 2004 – 2008. During that time, I didn’t see or hear the founders engaging with the local startup community in any shape or form. In fact, the company wasn’t run like a startup at all. It was run more like a big corporation. We had a dress code, a lot of internal politics, top heavy management and much more. All the hallmarks of a bureaucratic company.

I’m sure these guys aren’t the only ones in Dallas who have quietly started a company, gotten a lot of funding and have even made a successful exit. All under the radar!

It really goes back to the culture of the founders at these startups. If they came from a corporate background, which is very likely if they are from the DFW area, then they may run their startups in the same way.

If they don’t have a culture of sharing and giving back – the pay-it-forward culture, as they say – then the entire scene suffers as a result.

Aug 15 / Nizam Sayeed

Enabling color directory listings in Mac OS X Terminal

Add these two lines to your ~/.bash_profile to enable color in OS X Terminal:

export CLICOLOR=1
export LSCOLORS=ExFxCxDxBxegedabagacad

For more reference:
ls(1) Mac OS X Manual Page

Apr 19 / Nizam Sayeed

Suppressing MySQL/MySQLdb warning messages from Python

Here is a quick three line snippet that will suppress all of those annoying warning messages from MySQL when using MySQLdb:

from warnings import filterwarnings
import MySQLdb as Database
filterwarnings('ignore', category = Database.Warning)

To re-enable the warnings later on:

from warnings import resetwarnings
resetwarnings()
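
Here is a rough sketch of the suppress/restore pattern in context; the connection settings and table name below are just placeholders, and the classic offender is something like DROP TABLE IF EXISTS on a table that does not exist:

from warnings import filterwarnings, resetwarnings
import MySQLdb as Database

# suppress MySQL warnings while running statements that are expected to warn
filterwarnings('ignore', category = Database.Warning)

conn = Database.connect(host = 'localhost', user = 'someuser',
                        passwd = 'somepass', db = 'somedb')
cursor = conn.cursor()
cursor.execute('DROP TABLE IF EXISTS some_table')
cursor.close()
conn.close()

# restore the default warning behaviour
resetwarnings()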

More about the warnings Python standard library:

Nov 26 / Nizam Sayeed

Splitting up Django models

Sometimes it makes sense to split up your Django models for a specific application across multiple files instead of having all of them in one file. This allows for easier and simpler code organization and maintenance.

Splitting up your models in Django is fairly simple, although it requires a few extra steps. In our example, we’ll create two Django models, Foo and Bar, each defined in its own file within an app called myapp.

Let’s also assume that Bar has a ForeignKey to Foo.

Create a models Python module within the app. The application directory structure may look something like:

  • /myapp
    • /models
      • __init__.py
      • foo.py
      • bar.py

Here are the contents of foo.py:

from django.db import models

class Foo(models.Model):
    foo_text = models.CharField(max_length=100)  # max_length is required; the value here is arbitrary

    class Meta:
        app_label = 'myapp'


And here are the contents of bar.py:

from django.db import models
from myapp.models.foo import Foo

class Bar(models.Model):
    foo = models.ForeignKey(Foo)
    bar_text = models.CharField(max_length=100)

    class Meta:
        app_label = 'myapp'

Notice the definition of the app_label property in the inner Meta classes for each model. This is very important to let Django’s syncdb command know that these split up model classes belong to the same application.

We’re not done yet. You’ll also need to explicitly import each model class in the models module’s __init__.py file:

from myapp.models.foo import Foo
from myapp.models.bar import Bar

And that’s it. Run syncdb and you should be all set.
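
As a quick sanity check, you can exercise the relationship from the Django shell (manage.py shell); the field values here are just placeholders:

from myapp.models import Foo, Bar

foo = Foo.objects.create(foo_text='hello')
bar = Bar.objects.create(foo=foo, bar_text='world')

# the reverse relation from Foo back to Bar works as usual
print foo.bar_set.all()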

NOTE: One thing to note about ForeignKey relationships is that the import order in __init__.py is very important. Since Bar has a ForeignKey to Foo, Foo must be imported before Bar.

NOTE: If you are splitting up the contents of an existing models.py file, make sure to delete the original file when you are done, otherwise syncdb may get confused.

UPDATE (November 28, 2009 @ 3:30PM) Pedro Costa pointed out an existing issue with Django (ticket #6961) and loading fixtures for split up models. One solution is to just put the fixtures under the new /models sub-dir since Django is already looking for them there. However, this may break once they do fix the bug in the Django code base. Or you could try Justin Lilly’s approach, which was mentioned on his blog.

UPDATE (November 29, 2009 @ 10:04AM) I was asked to illustrate how to use a ForeignKey between these models. I have modified my post to show an example of how to do that.

Nov 7 / Nizam Sayeed

Setting up your own Hadoop cluster with Cloudera distro for EC2

Installing Prerequisites

Before you begin, figure out what distro you are on (if you don’t already know) by issuing this command from the shell:

lsb_release -c

For my set up, I used Ubuntu 8.04.3 LTS Hardy Heron.

Add Cloudera repositories for your distro to apt sources. Create a file called /etc/apt/sources.list.d/cloudera.list and add the following two lines in it:

deb [distro]-stable contrib
deb-src [distro]-stable contrib

Add repository GPG key to your keyring:

curl -s | sudo apt-key add -

Update apt:

sudo apt-get update

Install Hadoop:

sudo apt-get install hadoop

Install boto and simplejson for Python (I used pip but you could use easy_install as well):

sudo pip install boto simplejson

Setting Up Cloudera Client Scripts

Download the Cloudera client scripts:


The location of the scripts may change, so always use the link found on this page:

Once downloaded, extract the archive’s contents to /opt/cloudera. You could put it anywhere else you like.

Modify your environment variables and add the following to your .bash_profile:

export AWS_ACCESS_KEY_ID=[your AWS access key ID]
export AWS_SECRET_ACCESS_KEY=[your AWS secret access key]
export PATH=$PATH:/opt/cloudera

The Cloudera client scripts need to have access to your AWS credentials from the system environment in order to work properly. Adding the path to where the scripts reside will allow you to call the hadoop-ec2 command that comes bundled with the package.

Source your .bash_profile to update your environment with the changes.

source ~/.bash_profile

Create your .hadoop-ec2 config directory in your home directory:

mkdir ~/.hadoop-ec2

Create a file called ec2-clusters.cfg in your .hadoop-ec2 directory with the following contents:

[my-hadoop-cluster]
key_name=[your EC2 key name]
ssh_options=-i %(private_key)s -o StrictHostKeyChecking=no

You can define multiple blocks in this file, one for each of your clusters. We just created one block for a cluster named my-hadoop-cluster.

Test that the Cloudera client scripts are working:

hadoop-ec2 list

If everything is configured properly up to this point, you should get output that says:

No running clusters

Launch your cluster (in this case with a master and one slave node):

hadoop-ec2 launch-cluster my-hadoop-cluster 1

Set up a proxy to the cluster:

hadoop-ec2 proxy my-hadoop-cluster

Setting Up Pig

The version of Pig that is part of the Cloudera apt repositories was a little out of date for me, so I decided to install a more up-to-date version manually. The version of Hadoop that runs on the cluster is 0.18.3, and the latest version of Pig that is compatible with that Hadoop version is 0.4.0. You can download it from here:


If the link above doesn’t work, then try any of the mirrors for Pig listed on this page:

Extract the contents of the archive to /opt/pig. Again, you could put it anywhere you desire.

Add the following to your .bash_profile:

export HADOOP_CONF_DIR=~/.hadoop-ec2/my-hadoop-cluster
export HADOOP_HOME=/usr/lib/hadoop
export JAVA_HOME=/usr/lib/jvm/java-6-sun
export PIG_CLASSPATH=/opt/pig/pig-0.4.0-core.jar:$HADOOP_CONF_DIR
export PIG_HADOOP_VERSION=0.18.3

The HADOOP_CONF_DIR environment variable should point to the location of your cluster’s hadoop-site.xml file. If you are going to be running multiple clusters, you may want to create some shell scripts that set the appropriate environment variables for each cluster before running any commands like Pig, which depend on those values.

Don’t forget to add the path to the Pig bin directory to your path as well:

export PATH=$PATH:/opt/cloudera:/opt/pig/bin

Source your .bash_profile to update your environment with the changes and then run the pig command:

pig

It should launch Pig in mapreduce mode by default. In this mode, the grunt shell will be connected to your EC2 Hadoop cluster vs. your local Hadoop installation. You should see something like the following:

2009-11-07 15:02:15,930 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://
2009-11-07 15:02:16,653 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to map-reduce job tracker at:

Congratulations, you now have a fully functioning Hadoop cluster running on EC2 with Pig ready to do your bidding.

Nov 2 / Nizam Sayeed

rxvt – A better console for Cygwin

I have been using xterm on Cygwin/X for a long time now. I love the fact that you can have a native X Windows server running on Windows with xterm and all the X11 goodies. However, since upgrading to Windows 7, I have had a lot of problems with getting both X and xterm to work properly.

I was somehow able to cobble some stuff together to make it all work but then… disaster! I found out that copying text from Windows and pasting it into xterm (specifically when I am connected to a remote host via SSH) kills my network connection for a few seconds! It happens every time and I haven’t found a single solution to this problem on the web after Googling for it for months.

After a lot of hair pulling, I discovered that Cygwin has a port of rxvt. So I installed it and it is working perfectly for me. I created a shortcut on my desktop with this as the target:

C:\Cygwin\bin\rxvt.exe -display :0 -fn "Consolas-20" -tn rxvt -e /usr/bin/bash --login -i

Copy/paste is working again and everything is as it should be.

Oct 31 / Nizam Sayeed

Designing user interfaces with Balsamiq Mockups For Desktop

As a web developer, I find it quite time-consuming to meticulously code complex pages with HTML and CSS. What’s worse is to invest a lot of time doing that and then realize that your precious layout just doesn’t work from a UX standpoint. Sometimes you have multiple ideas for a layout and are torn over which one to go with.

There are many software tools out there which allow you to create wireframe mockups/sketches of user interfaces. Some of these tools can be quite expensive and for a startup that is bootstrapping, totally out of reach. Well, not to worry. I recently got my hands on a very inexpensive tool called Balsamiq Mockups For Desktop which really increased my productivity as a UX designer (one of the many hats I have to wear currently).

Mockups For Desktop is based on the Adobe AIR platform. Once you install the AIR runtime on your PC, installing AIR-based applications is a breeze.

Mockups For Desktop is an absolute bargain at $79 a license. The user interface is really intuitive and I was able to dive in and create my first mockup in about an hour. The interface sports a toolbar at the top with all your UI elements. Balsamiq has done a superb job of creating a toolset which is comprised of the most common UI elements you are likely to need in designing your mockups. You can simply drag and drop any of these elements onto the grid-based canvas below.

The elements are very simple to position with intelligent snap-to-grid alignment. Double-clicking an element allows you to edit its text content. You can also modify and fine tune the properties of each element by using the hovering property editor that shows up when an element has focus.

Overall, I am very impressed with Mockups For Desktop. I highly recommend it for anyone looking for a cost-effective way to design and lay out user interfaces. It has become an invaluable part of my software tool set.

Disclaimer: I was given a free license to the application for trying it out and reviewing it on my blog.

Sep 18 / Nizam Sayeed

Go to top of page using jQuery

This one-line jQuery snippet will force your browser to go to the top of your page:

$('html, body').animate({ scrollTop: 0 }, 0);

If you want to add some smooth scrolling:

$('html, body').animate({ scrollTop: 0 }, 'slow');
Aug 14 / Nizam Sayeed

Windows 7, 64-bit Python and easy_install

I recently downloaded and installed Windows 7 RTM on my laptop. I upgraded from 32-bit XP to a 64-bit flavor of Windows 7. I decided to install a 64-bit version of Python to take advantage of the 6GB of memory installed on my laptop. All well and good.

I proceeded to grab and set up easy_install which installed without any issues. Things started to go awry when I tried to actually install a package using easy_install. I started getting the following error:

Cannot find Python executable C:\Python25\python.exe

It turns out that this has been an issue with setuptools and 64-bit Python for a while:

There is an active ticket for setuptools as well (with the last update dated August 11, 2009 at the time of writing):

Hopefully, it will be resolved soon. My solution was to fall back to 32-bit Python in the interim.

May 8 / Nizam Sayeed

Customized comment notifications from Django

I recently had to implement a way to send notifications (using the excellent django-notification app developed by James Tauber) to users whose content is commented on in my Django web app. However, I wanted the owner/creator of the original content to get a more customized notification message. For example, if the model instance being commented on was an idea from an idea sharing app, I would like the notifications to look something like:

John Smith commented on your idea – “Switching to a better version control system”:

That’s a brilliant idea. I totally agree with you. Let’s make this happen.

Looks a lot better than just: “John Smith commented on something.”

Okay, so the first thing we need to do is to define the notice type for comments. Here are the values I used for my notice type:

  • label: comment_posted
  • display: Comment Posted
  • description: someone posted a comment for your content

from django.db.models import signals, get_app
from django.utils.translation import ugettext_noop as _
from django.core.exceptions import ImproperlyConfigured

try:
    notification = get_app( 'notification' )
    def create_notice_types( app, created_models, verbosity, **kwargs ):
        notification.create_notice_type( 'comment_posted', 
            _( 'Comment Posted' ), 
            _( 'someone posted a comment for your content' ) )
    signals.post_syncdb.connect( create_notice_types, sender = notification )    
except ImproperlyConfigured:
    print 'Skipping creation of NoticeTypes as notification app not found'

The next step is to create a signal handler and wire it to the post_save signal for the Comment model.

from django.db.models import signals
from django.contrib.comments.models import Comment
from django.contrib.contenttypes.models import ContentType
from django.utils.translation import ugettext as _
from django.core.exceptions import ImproperlyConfigured
from django.db.models import get_app

try:
    notification = get_app( 'notification' )
except ImproperlyConfigured:
    notification = None

# valid content types
VALID_CTYPES = [ 'link', 'note', 'update' ]

def send_comment_notification( sender, **kwargs ):
    # no point in proceeding if notification is not available
    if not notification:
        return

    if 'created' in kwargs:
        if kwargs[ 'created' ] == True:
            # get comment instance
            instance = kwargs[ 'instance' ]
            # get comment's content object and its ctype
            obj = instance.content_object
            ctype = ContentType.objects.get_for_model( obj )
            # check for valid content types
            if ctype.name not in VALID_CTYPES:
                return
            # for customized notification message
            if ctype.name == 'link':
                type = _( 'your link' )
                descr = obj.title  # assumed field name; use whatever best describes your link model
            elif ctype.name == 'note':
                type = _( 'your note' )
                descr = obj.title
            elif ctype.name == 'update':
                type = _( 'your status update' )
                descr = obj.update
            # send notification to content owner
            if notification:
                data = {
                    'comment': instance.comment, 
                    'user': instance.user,
                    'type': type,
                    'descr': descr,
                }

                # notification is sent to the original content object
                # owner/creator
                notification.send( [ obj.user ], 'comment_posted', data  )

# connect signal
signals.post_save.connect( send_comment_notification, sender = Comment )

Nothing fancy or complicated here. Some things to note:

  • I created a list of content type names called VALID_CTYPES which I use to control which content types trigger the creation of a custom notification.
  • Based on the content type’s name, I define some text that will be used in the notification message (type).
  • I also assign a descriptive name for the item in question (descr). The descriptive name should map to whatever field in the model instance best describes that instance. You can also use __unicode__() if you wish (see the sketch after this list); you just have to make sure that it is defined in the model class and is suited for display in the notification message.
  • The notification is only sent to the owner/creator of the content being commented on.
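
For instance, a minimal sketch of the __unicode__() approach (the Note model and its title field are placeholders, not code from this post):

from django.db import models

class Note( models.Model ):
    title = models.CharField( max_length = 100 )

    class Meta:
        app_label = 'myapp'

    def __unicode__( self ):
        # return whatever best describes this instance in a notification
        return self.title

# ...and in send_comment_notification, instead of picking a field per content type:
descr = unicode( obj )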

Finally, we create a simple template for the notification e-mail and put it in templates/notification/comment_posted/:

{% load i18n %}
{% blocktrans with comment as comment and type as type and descr as descr and user.get_full_name as user %}
{{ user }} commented on {{ type }} - "{{ descr }}":

{{ comment }}
{% endblocktrans %}

There you have it. Some food for thought:

  • In the code above, I only send the notification to the owner/creator of the content. However, you could easily add some code to grab a list of all the users who previously commented on the same content object and spam them as well (ala Facebook)!
  • If you have permalinks defined for your models, you could pass the actual content object to the notification template and use get_absolute_url() to display a direct link to the object’s view.
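
For the second idea, a minimal sketch: include the content object itself in the notification data (the extra 'obj' key is my own addition, not part of the code above):

# inside send_comment_notification, alongside the existing keys
data = {
    'comment': instance.comment,
    'user': instance.user,
    'type': type,
    'descr': descr,
    'obj': obj,
}

The notification template can then pull obj into the blocktrans context just like the other variables and render a link using obj.get_absolute_url.
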
Apr 27 / Nizam Sayeed

Dynamic Django queries (or why kwargs is your friend)

A very easy way to dynamically build queries in Django is to use Python kwargs (keyword arguments).

Let’s say we have a model that looks something like this:

class Entry( models.Model ):
    user = models.ForeignKey( User, related_name = 'entries' )
    category = models.ForeignKey( Category, related_name = 'entries' )
    title = models.CharField( max_length = 64 )
    entry_text = models.TextField()
    deleted_datetime = models.DateTimeField( null = True, blank = True )  # nullable, so the isnull filter below makes sense

Our goal is to dynamically build a query as we go in a view. Using kwargs, we can easily do something like this in our view code:

kwargs = {
    # you can set common filter params here
}

# will return entries which don't have a deleted_datetime
if exclude_deleted:
    kwargs[ 'deleted_datetime__isnull' ] = True

# will return entries in a specific category
if category is not None:
    kwargs[ 'category' ] = category

# will return entries for current user
if current_user_only:
    kwargs[ 'user' ] = request.user

# will return entries where titles match some search query
if title_search_query != '':
    kwargs[ 'title__icontains' ] = title_search_query

# apply all filters and fetch entries that match all criteria
entries = Entry.objects.filter( **kwargs )

It’s that simple. This approach seems quite pedestrian when you think about it. However, I didn’t find any examples online which actually show this in use. It may be useful for someone new to Django.

UPDATE (Apr 30, 2009 @ 9:39AM) Based on the comments I received, I wanted to update this post a little bit. It was mentioned that this approach may not work if you use Q objects for complex lookups. It turns out that QuerySet filter() accepts both args and kwargs. So you could actually do something like:

kwargs = { 'deleted_datetime__isnull': True }
args = ( Q( title__icontains = 'Foo' ) | Q( title__icontains = 'Bar' ), )  # note the trailing comma, so args is a one-element tuple
entries = Entry.objects.filter( *args, **kwargs )

Very cool indeed. Django never ceases to amaze me.
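
If you also need to build the Q objects themselves dynamically, say from a list of search terms, the two ideas combine nicely. Here is a sketch (the list of terms is just a placeholder):

import operator

from django.db.models import Q

terms = [ 'Foo', 'Bar', 'Baz' ]  # e.g. parsed from a search box

kwargs = { 'deleted_datetime__isnull': True }
args = ( reduce( operator.or_, [ Q( title__icontains = term ) for term in terms ] ), )

entries = Entry.objects.filter( *args, **kwargs )

reduce() with operator.or_ simply ORs all of the Q objects together before handing them to filter().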