Python's logging module is great, and Django's integration with it is also very good. But sometimes we have to run multiple instances (processes) and want to save logs without the problems caused by files opened by multiple processes.

django-logstream solves this problem. It runs as a service (a separate process) that receives logs from different instances, allowing multiple processes to store their logs in one file without any problem.

Currently, django-logstream uses ZeroMQ for interprocess communication, and it now ships with integrated encryption!
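To make the idea concrete, here is a rough stdlib-only sketch of the pattern django-logstream implements: several worker processes hand their log records to a single listener, which is the only writer. The sketch uses a multiprocessing queue where django-logstream uses a ZeroMQ socket; all names in it are illustrative, not django-logstream's actual API.

```python
# Sketch of the one-writer pattern: workers produce log records,
# a single listener consumes them. (multiprocessing.Queue stands in
# for django-logstream's ZeroMQ socket.)
import logging
import logging.handlers
import multiprocessing

# 'fork' keeps this a plain script on Unix; not available on Windows.
ctx = multiprocessing.get_context('fork')
queue = ctx.Queue()

records = []


class ListHandler(logging.Handler):
    """Stands in for the single log file the daemon would write."""
    def emit(self, record):
        records.append(record.getMessage())


def worker(q, worker_id):
    # Each process only pushes records onto the shared queue;
    # it never opens the log file itself.
    logger = logging.getLogger('app')
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.handlers.QueueHandler(q))
    logger.info('hello from worker %d', worker_id)


# The listener plays the role of the logstream daemon: one consumer
# draining records from all producers.
listener = logging.handlers.QueueListener(queue, ListHandler())
listener.start()

procs = [ctx.Process(target=worker, args=(queue, i)) for i in range(3)]
for p in procs:
    p.start()
for p in procs:
    p.join()
listener.stop()  # flushes remaining records before returning
```

Because only the listener touches the output, there is no contention on the file, which is exactly the guarantee the logstream daemon provides across Django instances.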

Features currently implemented:

  • aliases for receiving multiple streams.
  • log rotation by time interval.
  • encryption and hashing of all messages.

Features planned for the future:

  • log rotation by size.
  • rabbitmq backend. (not a priority)
  • redis backend. (not a priority)

How-To install

pip install django-logstream

How-To setup logstream daemon

As a first step, add LOGSTREAM_STORAGE_PATH to your settings. This tells the server where to store all the logs.

Add django_logstream.server to the INSTALLED_APPS list.

And as the last step, start the service with: python2 logstreamd.
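Putting the two configuration steps together, the relevant part of settings.py might look like this (the storage path is just an example value, not a default):

```python
# settings.py (server instance)
INSTALLED_APPS = [
    # ... your other apps ...
    'django_logstream.server',
]

# Directory where the daemon stores all received logs
# (example path; use any directory writable by the daemon).
LOGSTREAM_STORAGE_PATH = '/var/log/myproject'
```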

Other configuration options:

  • ZeroMQ bind address. Default: ipc:///tmp/logstream_receiver
  • Secure mode: when enabled, only messages that are encrypted and validated with a SHA-1 hash are accepted. Default: False
  • Interval in minutes for log rotation. Default: 60

How-To setup logstream client

As the first and only step, configure your logging in Django.

This is a possible example:

    'handlers': {
        'logstream': {
            'level': 'DEBUG',
            'class': 'django_logstream.client.handlers.threaded.ZeroMQHandler',
            'alias': 'myfirsttest',
            'address': 'ipc:///tmp/logstream_receiver',  # this is the default
            'encrypt': True,  # default is False
        },
    },
    'loggers': {
        'yourlogger': {
            'level': 'DEBUG',
            'handlers': ['logstream'],
            'propagate': False,
        },
    },
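To see how a config of this shape behaves, here is a self-contained sketch that keeps the same loggers layout but swaps the ZeroMQHandler for a stdlib StreamHandler, so it runs without django-logstream installed:

```python
import io
import logging
import logging.config

buffer = io.StringIO()

LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        # Stand-in for django_logstream's ZeroMQHandler, so the
        # example is runnable without the package installed.
        'logstream': {
            'level': 'DEBUG',
            'class': 'logging.StreamHandler',
            'stream': buffer,
        },
    },
    'loggers': {
        'yourlogger': {
            'level': 'DEBUG',
            'handlers': ['logstream'],
            'propagate': False,
        },
    },
}

logging.config.dictConfig(LOGGING)

# Anything logged through 'yourlogger' now flows to the handler.
logging.getLogger('yourlogger').debug('ping')
```

In a real project the LOGGING dict lives in settings.py and Django applies it for you; with the real ZeroMQHandler the record would travel over the ZeroMQ socket to the logstream daemon instead of a local stream.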