What is the best design pattern for batch insertion using the Django REST Framework?

Paul J

Background

I have a Django app that allows record insertion via the Django REST Framework.

Records will be batch-inserted periodically, row by row, by client applications that interrogate spreadsheets and other databases. The REST API abstracts these other applications, which handle data transformation and so on, away from Django.

Problem

I'd like to decouple the actual record insertion from the API to improve fault tolerance and the potential for scalability.

Suggested Approach

I am considering doing this with Celery, though I've not used it before. The idea is to override perform_create() (added in DRF 3.0) in my existing DRF ModelViewSets so that it creates Celery tasks, which workers would then grab and process in the background.
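For completeness, I'm assuming the standard Celery-with-Django wiring from the Celery documentation (the project name myproject is mine):

```python
# myproject/celery.py — standard Celery/Django bootstrapping
import os

from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

app = Celery('myproject')
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()  # picks up @shared_task functions from each app's tasks.py
```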

The DRF documentation says that perform_create() "should save the object instance by calling serializer.save()". I'm wondering whether, in my case, I could ignore that recommendation and instead have my Celery tasks call the appropriate serializer to perform the object saves.

Example

If for example I've got a couple of models:

class Book(models.Model):
    name = models.CharField(max_length=32)

class Author(models.Model):
    surname = models.CharField(max_length=32)

And I've got DRF views and serializers for those models:

class BookSerializer(serializers.ModelSerializer):
    class Meta:
        model = Book
        fields = ['id', 'name']  # an explicit fields list is required in recent DRF

class AuthorSerializer(serializers.ModelSerializer):
    class Meta:
        model = Author
        fields = ['id', 'surname']

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

class AuthorViewSet(viewsets.ModelViewSet):
    queryset = Author.objects.all()
    serializer_class = AuthorSerializer

Would it be a good idea to override perform_create() in e.g. BookViewSet:

def perform_create(self, serializer):
    # .delay() queues the task instead of calling it synchronously
    create_book_task.delay(serializer.data)

Where create_book_task is separately something like:

@shared_task
def create_book_task(data):
    serializer = BookSerializer(data=data)
    serializer.is_valid(raise_exception=True)  # must validate before save()
    serializer.save()

I've not really been able to find examples of other developers doing something similar or trying to solve the same problem. Am I overcomplicating it? My database will still be the limiting factor for the physical insertion, but at least it won't block the API clients from queueing up their data. I am not committed to Celery if it isn't suitable. Is this the best solution? Are there obvious problems with it, or are there better alternatives?

Sebastian

I find your approach sound. Celery is great, except for some border cases that can get a little nasty in my experience (though I wouldn't expect you to run into those in the use case you outline).

However, consider a simplified approach as follows using Redis. It has some pros and cons.

In BookViewSet:

from redis import StrictRedis
from rest_framework import viewsets, renderers

redis_client = StrictRedis()

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

    def perform_create(self, serializer):
        json = renderers.JSONRenderer().render(serializer.data)
        redis_client.lpush('create_book_task', json)

In a separate worker script:

from io import BytesIO  # django.utils.six was removed in Django 3.0

from redis import StrictRedis
from rest_framework.parsers import JSONParser
from myproject import BookSerializer, Book

redis_client = StrictRedis()

MAX_BATCH_SIZE = 1000

def create_book_task():
    bookset = []
    while len(bookset) < MAX_BATCH_SIZE:
        if not bookset:
            # Block until the first item arrives; brpop returns a (key, value) pair
            _, payload = redis_client.brpop(('create_book_task',))
        else:
            # Then drain any further queued items without blocking
            payload = redis_client.rpop('create_book_task')
            if payload is None:
                break
        data = JSONParser().parse(BytesIO(payload))
        serializer = BookSerializer(data=data)
        serializer.is_valid(raise_exception=True)
        # is_valid() does not build a model instance, so construct one for bulk_create()
        bookset.append(Book(**serializer.validated_data))

    if bookset:
        Book.objects.bulk_create(bookset)

while True:
    create_book_task()

Pros

  • You don't need to add Celery (again: I love it, but it makes testing a little trickier and can get a little hairy depending on workload, configuration, etc.)
  • It handles bulk creation, so if thousands of books are submitted over a short timespan (a second or less), only a few INSERTs hit the database instead of thousands

Cons

  • You're taking care of the low-level serialization yourself instead of letting Celery do it "magically"
  • You'll need to manage the worker script yourself (daemonizing it, perhaps packaging it as a management command, handling restarts, etc.) instead of handing that off to Celery

Of course, the above is only a first approach. You might want to make it more generic so it can be reused for additional models, move MAX_BATCH_SIZE into your settings, use pickling instead of JSON, or make a variety of other adjustments, improvements, or design decisions according to your specific needs.
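As a starting point for making it generic, the queue-draining logic can be factored out so the same helper serves every model. A minimal sketch (drain_queue and its two callables are hypothetical names, not part of redis-py):

```python
def drain_queue(pop_blocking, pop_nonblocking, max_batch=1000):
    """Collect up to max_batch raw payloads from a queue.

    pop_blocking: callable that blocks until one item is available and returns it.
    pop_nonblocking: callable returning the next item, or None if the queue is empty.
    """
    items = [pop_blocking()]  # wait for at least one item
    while len(items) < max_batch and (item := pop_nonblocking()) is not None:
        items.append(item)  # drain the rest without blocking
    return items

# Wired up to Redis, this might look like (list name per model):
# payloads = drain_queue(
#     lambda: redis_client.brpop(('create_book_task',))[1],
#     lambda: redis_client.rpop('create_book_task'),
# )
```

The deserialize-validate-bulk_create step then only needs the serializer class and model as parameters.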

In the end, I would probably go with the approach outlined in my answer, unless you anticipate several other tasks being offloaded to asynchronous processing, in which case the case for Celery becomes much stronger.

PS: Since the actual insertion happens asynchronously, consider responding with a 202 Accepted status code instead of 201 Created (unless that confuses your clients).
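Since perform_create() cannot change the response, returning 202 means overriding create() itself. A sketch (untested framework wiring, following the default create() implementation in DRF's mixins):

```python
from rest_framework import status, viewsets
from rest_framework.response import Response

class BookViewSet(viewsets.ModelViewSet):
    queryset = Book.objects.all()
    serializer_class = BookSerializer

    def create(self, request, *args, **kwargs):
        serializer = self.get_serializer(data=request.data)
        serializer.is_valid(raise_exception=True)
        self.perform_create(serializer)  # only queues the data; nothing is saved yet
        # 202 signals the request was accepted but not yet acted upon
        return Response(serializer.data, status=status.HTTP_202_ACCEPTED)
```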
