SCRAPERS · SATELLITE · DEEP LEARNING · PIPELINES · REGISTERS · COMPUTE · ECONOMICS · GEOSPATIAL · INFRASTRUCTURE · RESEARCH

Still unnamed. But building serious stuff.
We create scalable pipelines, run large-scale scrapers, and train deep learning models to push the boundaries of economic research — from register data to remote sensing.

What We Do

We turn messy data into structured insight.

The Lab is a platform for building data-driven infrastructure to support ambitious research in economics and the social sciences. We don't just analyze data; we design the tools, workflows, and datasets that make complex analysis possible.

Rooted in curiosity and technical experimentation, the Lab brings together collaborators, students, and projects that push the boundaries of what's possible with modern data. It exists to scale and support research that combines empirical rigor with engineering intuition.

It's a space to explore neglected problems, build practical solutions, and contribute lasting tools to the open research community.

Our current focus areas include:

Satellite imagery for economic development and infrastructure analysis
Massive-scale scraping with rotating proxies, headless browsers, and robust schedulers (see the sketch after this list)
Register-based data workflows to streamline and harmonize microdata
Deep learning for classification, geospatial inference, and applied modeling
Political economy datasets built from legislative records and public policy archives
Labor economics with a structural and spatial angle
Experimental infrastructure for reproducible pipelines, high-performance computing, and dataset creation
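
To make the scraping item concrete, here is a minimal sketch of proxy rotation with retry and backoff. The proxy pool, target URL, and fetch helper are illustrative placeholders, not our production scheduler:

    # Minimal proxy-rotation sketch; proxy URLs are hypothetical placeholders.
    import itertools
    import time

    import requests

    PROXY_POOL = itertools.cycle([
        "http://proxy-a.example:8080",
        "http://proxy-b.example:8080",
    ])

    def fetch(url: str, retries: int = 3) -> str:
        """Fetch one page, rotating to the next proxy on every attempt."""
        for attempt in range(retries):
            proxy = next(PROXY_POOL)
            try:
                resp = requests.get(
                    url,
                    proxies={"http": proxy, "https": proxy},
                    timeout=10,
                )
                resp.raise_for_status()
                return resp.text
            except requests.RequestException:
                time.sleep(2 ** attempt)  # back off before trying the next proxy
        raise RuntimeError(f"all {retries} attempts failed for {url}")

In practice a scheduler drives many of these calls concurrently, and headless browsers step in where plain HTTP requests get blocked.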

Mentorship

We mentor one MSc student per year.

You'll contribute to real research tools, not toy projects. Work on scraping infrastructure, remote sensing pipelines, and register-based workflows that actually matter. The ideal student is curious, fast-moving, and not afraid of scale.

Applications for 2025-2026 will open soon.

Selected Projects

Coming Soon

We're currently working on some exciting projects that will be showcased here soon.

Infrastructure & Compute

Yes, we host our own compute.

We run a dedicated, water-cooled Threadripper server with:

Hardware Specifications
  • 48 cores, 256GB RAM, multi-GPU setup
  • Water-cooled AMD Threadripper platform
  • High-speed NVMe storage
Software Architecture
  • Dockerized services for modular deployments (see the sketch below)
  • VM-based isolation for scraping, ML, and data prep
  • Built to scale — for deep learning, geospatial pipelines, and large-scale scraping jobs
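
To give a flavour of how jobs stay isolated, here is a minimal sketch using the Docker SDK for Python. The image name and memory limit are assumptions for illustration, not our actual configuration:

    # Sketch of dispatching one pipeline stage into its own container.
    # The image name "lab/scraper:latest" is a hypothetical placeholder.
    import docker

    client = docker.from_env()

    def run_stage(image: str, command: str) -> str:
        """Run a pipeline stage in an isolated container and return its logs."""
        container = client.containers.run(
            image,
            command,
            detach=True,
            mem_limit="32g",  # keep one job from starving the others
        )
        container.wait()      # block until the stage exits
        return container.logs().decode()

    # Example: run_stage("lab/scraper:latest", "python scrape.py --shard 0")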