
Comment by christoff12

8 hours ago

I built the first half of my career as "a guy who knows SQL" (and Excel macros but I digress). I then rode the early wave of Analytics Engineering.

dbt (short for "data build tool") is kinda like Vite for folks working with data warehouses. Its biggest contribution was a mindset shift: applying principles of the SDLC to the traditional BI/Analytics space.

Almost overnight, analysts went from building business logic in GUIs like Talend or Tableau to writing code-based models (SQL) checked into git repos. It took what Looker was doing with LookML and generalized it across the BI stack.

This shift (+ associated tooling) resulted in less brittle data pipelines, increased uptime for dashboards/reporting, and more sanity when working with more than 2-3 people in a data environment.

Imagine a situation where you're at an e-commerce company and need to reconcile orders from Woocommerce with shipments in ShipStation, returns from tickets in HubSpot, and refunds issued in Stripe. dbt simplifies the management of the relationships between these various systems.
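As a rough sketch of what that looks like in a dbt project (every file, table, and column name here is made up for illustration), each source system gets its own staging model, and a downstream model stitches them together with dbt's `source()` and `ref()` functions, which is how dbt tracks the dependency graph between systems:

```sql
-- models/staging/stg_woocommerce_orders.sql (hypothetical file)
-- Thin cleanup layer over the raw WooCommerce export.
select
    id         as order_id,
    sku,
    total      as order_total,
    created_at as ordered_at
from {{ source('woocommerce', 'orders') }}

-- models/marts/fct_order_lifecycle.sql (hypothetical file)
-- One row per order, with its shipment and any refund attached.
-- Assumes similar staging models exist for ShipStation and Stripe.
select
    o.order_id,
    o.sku,
    s.shipped_at,
    s.destination_state,
    r.refund_reason,
    r.refunded_at
from {{ ref('stg_woocommerce_orders') }} o
left join {{ ref('stg_shipstation_shipments') }} s using (order_id)
left join {{ ref('stg_stripe_refunds') }} r using (order_id)
```

Because the relationships live in `ref()` calls rather than in a GUI, dbt can build the models in dependency order and show you lineage across all four systems.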

Based on this, you can build data models that allow you and, increasingly, your business stakeholders to answer questions like "Which SKUs have seen an uptick in refunds due to reason X this quarter?" and "Where were they shipped?"
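Against a mart model like the hypothetical `fct_order_lifecycle` above, that question collapses into a short query (names are illustrative, and date functions vary a bit by warehouse):

```sql
-- Refund counts by SKU and ship-to state for reason X this quarter.
select
    sku,
    destination_state,
    count(*) as refund_count
from analytics.fct_order_lifecycle
where refund_reason = 'X'
  and refunded_at >= date_trunc('quarter', current_date)
group by 1, 2
order by refund_count desc
```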

The benefit of standard abstractions is that you can build metrics on top of the models, as [gkapur](https://news.ycombinator.com/item?id=41853925) mentions, such that "revenue" is the same when marketing pulls it to calculate CAC as when finance pulls it for their monthly reports, etc.
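In dbt's semantic layer, that shared definition might look something like this (a sketch only: the exact YAML schema depends on your dbt version, this follows the MetricFlow-style spec, and the measure name is invented):

```yaml
# models/marts/revenue.yml (hypothetical file)
metrics:
  - name: revenue
    label: Revenue
    description: "One definition of revenue, shared by marketing and finance."
    type: simple
    type_params:
      measure: order_total  # assumed measure defined in a semantic model
```

Every team then queries the `revenue` metric by name instead of re-deriving the sum in their own tool.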