Category: django

Adding the time taken to respond to a request to the headers of a Django Rest Framework response. A suitable place to use a mixin or not?

Ever wanted to add the time taken to generate a response to your API, so the client knows how long the request took?
I first noticed this cool feature on AWX, a management platform for Ansible playbooks.

Their response headers looked like this:

[image: awx-api-timed-response]

So I checked out their source code and copied how they did it:

In generics.py the package extends the base DRF APIView class and adds quite a bit; the important parts for us are:


    def finalize_response(self, request, response, *args, **kwargs):
        ...
        response = super().finalize_response(request, response, *args, **kwargs)
        time_started = getattr(self, 'time_started', None)
        response['X-API-Node'] = settings.CLUSTER_HOST_ID
        ...
        
    def initialize_request(self, request, *args, **kwargs):
        ...
        self.time_started = time.time()
        ...

So I created a class extending APIView:


import time

from rest_framework import views


class APIView(views.APIView):
    '''
    Add timing to the base APIView class
    '''
    def initialize_request(self, request, *args, **kwargs):
        self.time_started = time.time()
        return super().initialize_request(request, *args, **kwargs)
        
    def finalize_response(self, request, response, *args, **kwargs):
        response = super().finalize_response(request, response, *args, **kwargs)
        time_started = getattr(self, 'time_started', None)
        if time_started:
            time_elapsed = time.time() - self.time_started
            response['X-API-Time'] = '%0.3fs' % time_elapsed
        return response

Now all I needed to do was extend from my_package.APIView instead of rest_framework.views.APIView and I would get an X-API-Time header.

It worked well for a while, but then I needed to use more generic class-based views (which extend from APIView)…check out CDRF. I could just add these methods to extended versions of rest_framework.generics.ListCreateAPIView etc.

But that just doesn’t feel right, simply because I can see I am repeating myself multiple times. I should just be able to define this functionality once and have all similar classes (those that extend from APIView) get that functionality…

Using a Mixin to add the Time taken to Respond

I just created a mixin by extracting those methods into a single class that doesn’t inherit from anything:


import time


class TimedAPIMixin:
    def initialize_request(self, request, *args, **kwargs):
        self.time_started = time.time()
        return super().initialize_request(request, *args, **kwargs)

    def finalize_response(self, request, response, *args, **kwargs):
        response = super().finalize_response(request, response, *args, **kwargs)
        time_started = getattr(self, 'time_started', None)
        if time_started:
            time_elapsed = time.time() - self.time_started
            response['X-API-Time'] = '%0.3fs' % time_elapsed
        return response

Using it in the other views:


from rest_framework import views
from rest_framework import generics


class APIView(TimedAPIMixin, views.APIView):
    pass


class ListCreateAPIView(TimedAPIMixin, generics.ListCreateAPIView):
    pass


class ListAPIView(TimedAPIMixin, generics.ListAPIView):
    pass

It only works if the mixin is the first thing the class inherits from: because of Python’s method resolution order, if the concrete view class came first, its initialize_request and finalize_response would be found before the mixin’s, and the timing header would never be added.
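
To check that the header actually comes through, here is a quick sanity test using DRF’s test utilities — a minimal sketch, assuming the timed APIView above is importable from my_package.views (PingView, the module path and the test are illustrative names):

from rest_framework.response import Response
from rest_framework.test import APIRequestFactory, APITestCase

from my_package.views import APIView  # the timed APIView defined above


class PingView(APIView):
    def get(self, request):
        return Response({'ping': 'pong'})


class TimingHeaderTest(APITestCase):
    def test_response_has_timing_header(self):
        request = APIRequestFactory().get('/ping/')
        response = PingView.as_view()(request)
        # finalize_response should have stamped the header, e.g. '0.002s'
        self.assertIn('X-API-Time', response)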

Authenticated Functional Tests with Selenium and Django

In the test-driven development book for Python and Django by Harry Percival, Obey The Testing Goat, there is a chapter about enhancing the functional test base class and adding pre-authentication, so you don’t need to log in via the login screen with Selenium.

It uses a custom email authentication backend, but I needed to implement this with the standard django.contrib.auth.backends.ModelBackend.
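
For reference, the snippets below assume imports along these lines — SessionStore, the session key constants and settings all come from Django itself (HASH_SESSION_KEY is only needed for the final version):

from django.conf import settings
from django.contrib.auth import (
    BACKEND_SESSION_KEY, HASH_SESSION_KEY, SESSION_KEY
)
from django.contrib.sessions.backends.db import SessionStore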

My First Attempt


    def create_pre_authenticated_session(self, user):
        '''Create a pre-authenticated session for the given user'''
        session = SessionStore()
        session[SESSION_KEY] = user.pk
        session[BACKEND_SESSION_KEY] = settings.AUTHENTICATION_BACKENDS[0]
        session.save()
        # visit domain (404 quickest)
        self.browser.get(self.live_server_url + "/404_no_such_url/")
        self.browser.add_cookie(dict(
            name=settings.SESSION_COOKIE_NAME,
            value=session.session_key,
            path='/',
        ))

I ran my functional test and something weird was happening: the cookie was getting killed right after this method was called and the browser went to any page.

So I compared the cookie Selenium had set with one from a real logged-in session, as seen in the Firefox developer tools.
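
To see exactly what Selenium had stored, you can dump the cookies straight from the webdriver and compare them with the developer tools — a small sketch you can drop into the test temporarily:

# print every cookie Selenium currently holds, for comparison with the
# cookie visible in the browser's developer tools
for cookie in self.browser.get_cookies():
    print(cookie)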

The difference was the httpOnly flag. So I added it…


        self.browser.add_cookie(dict(
            name=settings.SESSION_COOKIE_NAME,
            value=session.session_key,
            path='/',
            secure=False,
            httpOnly=True
        ))

Nothing changed, the cookie was still gone.

So then I compared an existing decoded session with the one created via the method above.

To find the decoded session:

$ python manage.py shell
[...]
In [1]: from django.contrib.sessions.models import Session

# substitute your session id from your browser cookie here
In [2]: session = Session.objects.get(
    session_key="8u0pygdy9blo696g3n4o078ygt6l8y0y"
)

In [3]: print(session.get_decoded())
{'_auth_user_id': 'obeythetestinggoat@gmail.com', '_auth_user_backend':
'accounts.authentication.PasswordlessAuthenticationBackend'}

The Session Difference

I noticed there was a difference. A working session looked like this:

{'_auth_user_id': '1', '_auth_user_backend': 'django.contrib.auth.backends.ModelBackend', '_auth_user_hash': '6a34097f6dab2a1fc68f262e9e67186d2ad5ba93'}

whereas the one I created looked like this:

{'_auth_user_id': 1, '_auth_user_backend': 'django.contrib.auth.backends.ModelBackend'}

So the missing _auth_user_hash was the problem. I searched the Django source and found it in django.contrib.auth: on each request, get_user() compares the _auth_user_hash stored in the session against user.get_session_auth_hash() and flushes the session if they don’t match — which explains the disappearing cookie.

So I set the hash session key with: session[HASH_SESSION_KEY] = user.get_session_auth_hash()

It then worked.

The Solution


    def create_pre_authenticated_session(self, user):
        '''Create a pre-authenticated session for the given user'''
        session = SessionStore()
        session[SESSION_KEY] = user.pk
        session[BACKEND_SESSION_KEY] = settings.AUTHENTICATION_BACKENDS[0]
        session[HASH_SESSION_KEY] = user.get_session_auth_hash()
        session.save()
        # visit domain (404 quickest)
        self.browser.get(self.live_server_url + "/404_no_such_url/")
        self.browser.add_cookie(dict(
            name=settings.SESSION_COOKIE_NAME,
            value=session.session_key,
            path='/',
            secure=False,
            httpOnly=True
        ))
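
With that in place, a functional test can skip the login screen entirely. A sketch of how the helper might be used — the test class, user and URL are illustrative, and FunctionalTest is assumed to be the base class (from the book) that defines create_pre_authenticated_session, self.browser and live_server_url:

from django.contrib.auth import get_user_model

User = get_user_model()


class DashboardTest(FunctionalTest):

    def test_logged_in_user_sees_their_dashboard(self):
        user = User.objects.create_user(username='edith', password='top-secret')
        self.create_pre_authenticated_session(user)

        # the browser is already authenticated, no login form needed
        self.browser.get(self.live_server_url + '/dashboard/')
        self.assertIn('edith', self.browser.page_source)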

Deploying a Django app with dedicated web and db servers

One of the many architectural decisions that will start to impact you when you get to a level where you need to scale is splitting your db and app onto separate servers. Typically we start on a budget and have to share resources, but ideally you want to start out separate. The reason is that the db server will know exactly how much RAM is available to it at all times, which improves consistency and reliability.

Provision 2 Servers

To start off, provision two (Ubuntu) servers. To label things, give each a fully qualified domain name like web.myserver.com and db.myserver.com.

Then do a basic security and authentication setup on both servers.

The App Server

To set up the app server you can use this guide, which uses Python 3.6, Nginx, Gunicorn and MySQL. Just skip the database setup part.

The Database Server

Install PostgreSQL.

We need a role (user) for the database, and because this role will be adding extensions it needs to be a superuser.

CREATE ROLE dbuser LOGIN PASSWORD 'mydbpass' SUPERUSER;

Importantly, we need to look at Django’s optimal Postgres config and set the recommended connection parameters:

ALTER ROLE dbuser SET client_encoding TO 'utf8';
ALTER ROLE dbuser SET default_transaction_isolation TO 'read committed';
ALTER ROLE dbuser SET timezone TO 'UTC';

Then create the database:

CREATE DATABASE myproject;

OK…so now fill out the DATABASES setting in your application, and make sure the HOST is the internal IP, since the servers are (hopefully) within the same datacentre.
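
A sketch of what that might look like, using the role and database created above; 10.0.0.5 here just stands in for the db server’s internal IP:

# settings.py (snippet) -- credentials match the role created earlier
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'myproject',
        'USER': 'dbuser',
        'PASSWORD': 'mydbpass',
        'HOST': '10.0.0.5',  # db server's internal IP (illustrative)
        'PORT': '5432',
    }
}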

Before that will work, though, we need to configure Postgres to listen for and allow connections from the internal network. We don’t want public IPs to have access to it, only our app server within the same datacentre. I’ve done this with MySQL before but forgot how, so here is what I found.

The first thing is to set up the Uncomplicated Firewall (ufw):


sudo ufw enable
sudo ufw allow OpenSSH
sudo ufw status

Now we want to enable connections from our app server:

sudo ufw allow from app_server_internal_ip_address to any port 5432

Log into psql and set Postgres to listen on all IPs:

ALTER SYSTEM SET listen_addresses = '*';

then reload the config (note that a change to listen_addresses only takes effect after a full restart, which we do below anyway):

SELECT pg_reload_conf();

Check where your pg_hba.conf is with:

SELECT name, setting FROM pg_settings WHERE category = 'File Locations';

then add the following line, using your app server’s internal IP (10.0.0.4 in this example):


# IPv4 local connections:
host    all             all             10.0.0.4/32            md5

Restart


sudo systemctl restart postgresql

Test with the postgres client on the app server:

sudo apt install postgresql-client
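
Once the client is installed you can connect with psql to the db server’s internal IP. Another quick check is to confirm Django itself can reach the database, for example from python manage.py shell on the app server — a sketch assuming the DATABASES setting above:

from django.db import connection

# run a trivial query through Django's configured connection
with connection.cursor() as cursor:
    cursor.execute("SELECT version();")
    print(cursor.fetchone()[0])  # prints the PostgreSQL version string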

There are a few performance tweaks you can do, but I’m always inclined to leave the defaults in place until there’s a reason to change them.

https://www.digitalocean.com/community/tutorials/how-to-secure-postgresql-against-automated-attacks

Allow remote connections to PostgreSQL: https://stackoverflow.com/questions/22080307/access-postgresql-server-from-lan