Dawarich: Self-Hosted Google Location History Alternative

Why You Should Run Dawarich Instead of Google Location History

Google Location History is convenient—it just works. But it's also a permanent record of everywhere you've been, sitting in Google's data centers and subject to their privacy policy changes, law enforcement requests, and algorithmic analysis. If you're running a homelab, self-hosting your location data is the logical next step: Dawarich is a lightweight, privacy-first alternative that imports your Google Takeout data, stores it locally, and integrates seamlessly with other self-hosted services like Immich.

This guide is for people who already run Docker and understand why self-hosting matters. I'm assuming you've got a functioning Docker host and basic familiarity with container networking.

Prerequisites and What You'll Need

Software versions tested:

  • Docker 26.1.3 (or later)
  • Docker Compose 2.28.1
  • Dawarich 0.91.0
  • PostgreSQL 15.7
  • Ubuntu 24.04.1 LTS or Debian 12.6

System requirements:

  • 2 CPU cores minimum (4 cores recommended for background processing)
  • 4GB RAM minimum (8GB if you're importing years of location history)
  • At least 20GB free disk space for database and location data
  • Docker and Docker Compose installed and working
  • A Google Takeout export with location history (TimelineObjects format)

Gotcha #1: Dawarich requires PostgreSQL—SQLite won't work. The background worker that processes location imports is resource-intensive, so undersizing the database VM will cause timeouts on large imports.
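If you're pushing a decade of history through a modestly sized database, it can also help to give Postgres more memory than the image defaults. A sketch of what that might look like in the `db` service of the compose file below (the values are illustrative starting points for an import-heavy workload, not tuned recommendations):

```yaml
  db:
    image: postgres:15.7-alpine
    # Raise memory-related defaults for bulk imports; scale to your RAM budget.
    command: postgres -c shared_buffers=1GB -c work_mem=64MB -c maintenance_work_mem=256MB
```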

Set Up Dawarich with Docker Compose

Dawarich is designed for containerization. On my T5810 with 24GB RAM running Ubuntu 24.04, I run the full stack (Dawarich, PostgreSQL, Redis) in Docker with persistent volumes for the database.

Create a project directory and generate a strong secret key:


mkdir -p ~/docker/dawarich && cd ~/docker/dawarich
echo "SECRET_KEY=$(python3 -c 'import secrets; print(secrets.token_urlsafe(50))')" > .env

Now create the docker-compose.yml:


services:
  db:
    image: postgres:15.7-alpine
    container_name: dawarich_db
    environment:
      POSTGRES_DB: dawarich
      POSTGRES_USER: dawarich
      POSTGRES_PASSWORD: ${DB_PASSWORD:-changeme}
    volumes:
      - dawarich_db:/var/lib/postgresql/data
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U dawarich"]
      interval: 10s
      timeout: 5s
      retries: 5

  redis:
    image: redis:7.2-alpine
    container_name: dawarich_redis
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 10s
      timeout: 5s
      retries: 5

  web:
    image: ghcr.io/dawarich-app/dawarich:0.91.0
    container_name: dawarich_web
    environment:
      DATABASE_URL: postgresql://dawarich:${DB_PASSWORD:-changeme}@db:5432/dawarich
      REDIS_URL: redis://redis:6379/0
      SECRET_KEY: ${SECRET_KEY}
      DEBUG: "false"
      RAILS_ENV: production
    ports:
      - "127.0.0.1:3000:3000"
    depends_on:
      db:
        condition: service_healthy
      redis:
        condition: service_healthy
    volumes:
      - dawarich_storage:/app/storage
    restart: unless-stopped

  sidekiq:
    image: ghcr.io/dawarich-app/dawarich:0.91.0
    container_name: dawarich_sidekiq
    command: bundle exec sidekiq -c 4
    environment:
      DATABASE_URL: postgresql://dawarich:${DB_PASSWORD:-changeme}@db:5432/dawarich
      REDIS_URL: redis://redis:6379/0
      SECRET_KEY: ${SECRET_KEY}
      RAILS_ENV: production
    depends_on:
      - db
      - redis
    volumes:
      - dawarich_storage:/app/storage
    restart: unless-stopped

volumes:
  dawarich_db:
  dawarich_storage:

Set a proper database password in your `.env` file:


echo "DB_PASSWORD=$(python3 -c 'import secrets; print(secrets.token_urlsafe(32))')" >> .env
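If you'd rather generate both secrets in one pass, something like this works (a sketch; it writes a fresh `.env`, so don't run it over one you've already deployed with):

```shell
cd "$(mktemp -d)"   # substitute your real project directory, e.g. ~/docker/dawarich
# Write both secrets in one pass; token lengths match the steps above.
printf 'SECRET_KEY=%s\n' "$(python3 -c 'import secrets; print(secrets.token_urlsafe(50))')" > .env
printf 'DB_PASSWORD=%s\n' "$(python3 -c 'import secrets; print(secrets.token_urlsafe(32))')" >> .env
# Sanity check: exactly two KEY=value lines, no stray token lines.
grep -c '=' .env
# prints 2
```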

Bring up the stack:


docker compose up -d
docker compose logs -f web

Wait for the web service to report "Listening on" on port 3000. On first run, Rails will initialize the database schema automatically. This takes 30-60 seconds. Once you see that message, navigate to http://localhost:3000 and create your admin account.

Import Google Takeout Location Data

Before you can visualize your location history, you need to export it from Google. The request only takes a few minutes to set up, and it gives you a complete, offline copy of everything Google knows about where you've been.

Go to takeout.google.com, deselect all services, then select only "Location History" (labeled "Timeline" in newer Takeout exports). Request the archive in JSON format (not KML). Google will email you when it's ready—usually within a few hours.

Extract the archive locally. Google's TimelineObjects format is nested deeply:


unzip -q takeout-*.zip
find . -name "*.json" -path "*Location History*" | head -1
# You'll find something like:
# ./Takeout/Location History/Location History.json
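Before uploading, it's worth a sanity check that the export actually contains points. The exact layout varies by Takeout vintage; here's a sketch against the older flat `locations` array, using a fabricated two-record `sample.json` for illustration:

```shell
# Fabricated two-point sample in the older flat-array Takeout layout.
cat > sample.json <<'EOF'
{"locations": [
  {"timestampMs": "1500000000000", "latitudeE7": 525200000, "longitudeE7": 134050000},
  {"timestampMs": "1500000600000", "latitudeE7": 525201000, "longitudeE7": 134051000}
]}
EOF
# Count the location fixes; point the same one-liner at your real export.
python3 -c 'import json, sys; print(len(json.load(open(sys.argv[1])).get("locations", [])))' sample.json
# prints 2
```

If the count is zero or the key doesn't exist, open the file and check which layout your export uses before blaming the importer.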

In the Dawarich web UI, go to Settings → Import Location Data. Upload the Location History.json file. The Sidekiq worker will process it in the background. On the Sidekiq container, you'll see progress:


docker compose logs -f sidekiq | grep -i location

Gotcha #2: If your import stalls, check the Sidekiq logs and make sure the database hasn't run out of disk space. The import creates intermediate tables—on a large multi-year export, this can consume 5-10GB temporarily. After the import completes, reclaim the space (note that `VACUUM FULL` takes an exclusive lock and rewrites the tables, so run it while nothing else is writing):


docker compose exec db psql -U dawarich -d dawarich -c "VACUUM FULL;"

Access and Visualize Your Location History

Once the import completes, go to the Maps section in Dawarich. You'll see an interactive map powered by Leaflet showing all your location points, clustered by density. The timeline sidebar lets you filter by date range. Click any cluster to zoom in and see individual points with timestamps.

To expose Dawarich securely to the internet, reverse-proxy it through Nginx or Caddy with TLS. Here's a minimal Caddy config:


# Inside your Caddyfile
dawarich.example.com {
    reverse_proxy 127.0.0.1:3000
}

Reload Caddy, and you've got HTTPS-encrypted access to your location data with automatic certificate renewal.

Integrate Dawarich with Immich for Photo Timeline Context

If you're already running Immich for photo management, connecting it to Dawarich adds location metadata to your photos based on timestamps. Immich can automatically tag photos with location information from Dawarich's API.

In Dawarich, go to Settings → API and generate an API token. Then in Immich's settings (Admin → External Libraries), add Dawarich as a location source:

  • Dawarich URL: http://dawarich_web:3000 (use the internal Docker hostname if Immich is in the same Compose stack)
  • API Token: Paste the token from Dawarich
  • Enable "Auto-tag photos with location"

Immich will reverse-geocode locations from Dawarich and attach them to photos with matching timestamps. This works best if you're photographing throughout the day—Dawarich records location fixes every few minutes on Android or every 15 minutes on iOS.

Common Issues and Troubleshooting

Import hangs at 50% or shows "502 Bad Gateway"/"504 Gateway Timeout": The Sidekiq worker ran out of memory or timed out. Check available RAM:


free -h
docker stats dawarich_sidekiq

If the worker is using >80% of available memory, you need more RAM or you need to reduce the Sidekiq concurrency. Edit the compose file and change the Sidekiq command to `bundle exec sidekiq -c 2` instead of `-c 4`.
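Rather than editing docker-compose.yml directly, you can drop the change into an override file, which Compose merges automatically on the next `up`; a minimal sketch:

```yaml
# docker-compose.override.yml — picked up automatically by `docker compose up`
services:
  sidekiq:
    command: bundle exec sidekiq -c 2
```

Then `docker compose up -d sidekiq` recreates just the worker with the lower concurrency, and your base compose file stays pristine for future upgrades.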

Web container crashes on startup with "PG::ConnectionBad": The database didn't initialize properly. Check the DB logs:


docker compose logs db

If you see permission errors, the volume mount is broken. Drop the volume and restart—this wipes the database, so only do it on a fresh install or after taking a backup:


docker compose down
docker volume rm dawarich_dawarich_db
docker compose up -d

Maps don't load or show blank: Verify the Leaflet tile server is accessible. If you're behind a restrictive firewall, Dawarich defaults to OpenStreetMap tiles. You can self-host tiles using TileServer GL, but that's a separate project.

No location points after import: Run a quick SQL query to confirm data was inserted:


docker compose exec db psql -U dawarich -d dawarich \
  -c "SELECT COUNT(*) as location_count FROM locations;"

If the count is 0, the import failed silently. Check the Sidekiq UI at http://localhost:3000/sidekiq (requires authentication) for failed jobs and error messages.

Next Steps: Keep Your Location Data Fresh

Dawarich is a static archive by default—it imports historical data from Google Takeout. For ongoing tracking, you'll need to export your location history regularly or use a mobile app that sends location to Dawarich's API directly. The Dawarich GitHub repo documents mobile integration options.

You now have a self-hosted location history that's entirely under your control. Your data stays on your hardware, you can access it anytime, and you're not funding Google's surveillance infrastructure. Back up your dawarich_db and dawarich_storage volumes regularly—they're your entire location archive.
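A minimal nightly backup could look like the sketch below. The project path and the `dawarich_dawarich_storage` volume name are assumptions derived from the compose file above (check `docker volume ls` for yours); the docker commands are left commented so you can verify the names before wiring this into cron:

```shell
#!/bin/sh
# Sketch: nightly dump of the Dawarich database plus the storage volume.
backup_dir="${BACKUP_DIR:-$HOME/backups/dawarich}"
stamp=$(date +%F)
mkdir -p "$backup_dir"
# Logical database dump (restorable with psql):
# cd ~/docker/dawarich
# docker compose exec -T db pg_dump -U dawarich dawarich | gzip > "$backup_dir/db-$stamp.sql.gz"
# File-level copy of the storage volume via a throwaway container:
# docker run --rm -v dawarich_dawarich_storage:/data -v "$backup_dir":/backup \
#   alpine tar czf "/backup/storage-$stamp.tar.gz" -C /data .
echo "Backups for $stamp go to $backup_dir"
```

Test a restore at least once—an unverified backup of your entire location archive is a hope, not a backup.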

Related: If you need persistent mobile location tracking, pair this with OwnTracks (MQTT-based) or a similar open-source tracker that can push to Dawarich's API. For a deeper privacy setup, consider running Dawarich on a VPN behind a restricted firewall with no inbound access from the internet.
