
Docker Container for Data Processing

From €1,490

Custom pipeline for your data — cron jobs, webhooks, REST API, deployable on any Docker host. ETL, scraping, aggregation.

What you get

Package contents

  • Data import from CSV, JSON, Excel, APIs or databases
  • Processing logic: ETL, aggregation, filtering, joins
  • Cron-based or webhook-triggered execution
  • Python (pandas, requests) or Go (for performance)
  • Database connection: PostgreSQL, MySQL, MongoDB, ClickHouse
  • Docker Compose setup — one command to start
  • Structured logging via structlog, with errors emitted as JSON
  • Health check endpoint (HTTP 200 when everything is OK; see the sketch after this list)
  • Retry logic on API errors (exponential backoff)
  • Rate limiting when third-party APIs are called
  • Data export to CSV, JSON, Parquet, Excel
  • Optional: webhook on pipeline completion (e.g. Slack notification)
  • Deployment instructions for Hetzner/AWS/GCP
  • Monitoring dashboard via Grafana (optional, +290 EUR)
  • 30 days bugfixes post-launch
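
To make the health check item above concrete, here is a minimal sketch of such an endpoint using only the Python standard library. The port, the /health path and the LAST_RUN_OK flag are illustrative placeholders; in the delivered container this state is driven by the pipeline itself.

```python
# Minimal health check sketch: returns HTTP 200 while the last pipeline run
# succeeded and 503 otherwise, so an external monitor can poll /health.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

LAST_RUN_OK = True  # placeholder; the real pipeline updates this flag


class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/health":
            self.send_error(404)
            return
        ok = LAST_RUN_OK
        body = json.dumps({"status": "ok" if ok else "degraded"}).encode()
        self.send_response(200 if ok else 503)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HealthHandler).serve_forever()
```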

Communicated transparently

What's not included

  • Frontend / UI for the data (available separately as a web-app package)
  • Server / cloud costs (paid directly by you)
  • Third-party API license costs
  • Live streaming / realtime pipelines (Kafka etc.), available separately from 2,900 EUR
  • More than 5 data sources (billed at the hourly rate)
  • Complex ML models (available separately from 1,900 EUR)
  • GDPR-compliant PII hashing/pseudonymization (available separately from 290 EUR)

These items can be requested separately; we will prepare an individual offer.

Security & production-readiness

Security comes standard.

What cheap providers skip comes standard with us, in this package too:

  • HTTPS / SSL with auto-renewal
  • Firewall, Fail2Ban & rate-limiting
  • Dependency scan for known CVEs
  • Backup strategy in place
  • No secrets ever in the repository
  • GDPR-compliant cookie banner (for EU clients)
  • Code review by an experienced engineer before go-live
  • 30 days post-launch support (bugfixes included)

FAQ

What clients usually ask

How large can the data be?

Up to ~10 GB per run is handled well. Larger volumes need splitting/chunking (add-on from 290 EUR).
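
As an illustration of that splitting/chunking, here is a rough pandas sketch. The file name, chunk size and column names are assumptions for the example, not part of the package.

```python
# Chunked aggregation sketch: the CSV is read in 100k-row pieces so memory
# stays bounded even when the input file is many gigabytes.
import pandas as pd

totals: dict[str, float] = {}
for chunk in pd.read_csv("events.csv", chunksize=100_000):
    partial = chunk.groupby("customer_id")["amount"].sum()
    for key, value in partial.items():
        totals[key] = totals.get(key, 0.0) + value

pd.Series(totals, name="amount").to_csv("totals_by_customer.csv")
```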

How fast does the pipeline run?

It depends on data volume and source. Example: importing and transforming 100k rows from CSV takes about 30 seconds. APIs with rate limits take longer.

What if the data source API has problems?

Built-in retry logic (3 attempts with exponential backoff). On permanent failure, an alert goes out via email/Slack and the pipeline continues on the next cron run.
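
A simplified sketch of that behaviour, assuming a JSON API fetched with requests; the URL and the alerting hook are placeholders.

```python
# Retry sketch: up to 3 attempts, sleeping 1 s and then 2 s between attempts.
import time

import requests


def fetch_with_retry(url: str, attempts: int = 3, timeout: int = 10) -> dict:
    for attempt in range(attempts):
        try:
            response = requests.get(url, timeout=timeout)
            response.raise_for_status()
            return response.json()
        except requests.RequestException:
            if attempt == attempts - 1:
                # Permanent failure: here the real pipeline would send the
                # email/Slack alert; the next cron run starts fresh.
                raise
            time.sleep(2 ** attempt)
```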

Can I extend the pipeline later?

Yes, the pipeline is built modularly. A new data source costs roughly 290 EUR; new processing logic is billed at the hourly rate or as a flat fee.

Where does the pipeline run?

Our recommendation: Hetzner CX21 (4 EUR/month) for smaller pipelines. AWS/GCP are also possible.

What about GDPR?

If personal data is involved: we sign a data processing agreement (DPA) with you, servers are located in DE/EU, and optional PII hashing is available as an add-on (290 EUR).
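
For context, keyed pseudonymization in that add-on could look roughly like the sketch below; the environment-variable name and the sample value are assumptions, not the delivered implementation.

```python
# Pseudonymization sketch: HMAC-SHA256 with a secret key kept outside the
# repository. The same input always maps to the same opaque token, so joins
# keep working, but the original value cannot be recovered without the key.
import hashlib
import hmac
import os

SECRET_KEY = os.environ["PII_HASH_KEY"].encode()  # placeholder variable name


def pseudonymize(value: str) -> str:
    return hmac.new(SECRET_KEY, value.strip().lower().encode(), hashlib.sha256).hexdigest()


print(pseudonymize("jane.doe@example.com"))
```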

Important note

The price shown is a non-binding estimate. Actual effort is determined individually after a free briefing call and provided in writing.

Have a project?

Let's bring your idea to life together. We're happy to advise you with no obligation.

Get in Touch