
Querying Postgres and Google Sheets Together with Kvatch in 5 Minutes

September 2, 2025
5 min read

Data rarely lives in one place. You might have structured data in Postgres, but your team still tracks budgets or projects in Google Sheets. What if you could query them both together with plain SQL — without building ETL pipelines?

That's exactly what Kvatch CLI enables: it treats all your sources as one federated dataset. In this quick tutorial, you'll learn how to connect Postgres and Google Sheets and query them together in under five minutes.


🛠 Prerequisites

  • Kvatch CLI installed
  • A Postgres table called projects. We set up a read-only example for you on Neon.
  • A Google Sheet with a tab called employees. Again, we made one for you and shared it here
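
Both example sources are tiny. If you'd rather recreate them with your own data instead, the shapes below are enough (a sketch; column names are inferred from the queries used later in this post):

-- Postgres: a minimal projects table
-- (the hosted Neon example may contain additional columns)
CREATE TABLE projects (
    id          SERIAL PRIMARY KEY,
    name        TEXT NOT NULL,
    owner_email TEXT NOT NULL
);

-- Google Sheet: the employees tab should have a header row containing
-- at least these columns: employee_name, department, email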

⚡ Step 1. Define Your Plan

Kvatch works from a simple YAML file (a plan) that defines your connectors and datasets.

Create a file called plan.yaml:

name: postgres_sheets_example
storage:
  type: sqlite
  data_store_path: combined.db

connectors:
  - name: projects_db
    type: POSTGRES
    connection:
      host: "ep-jolly-shadow-a2hk38fo-pooler.eu-central-1.aws.neon.tech"
      port: 5432
      database: "examples"
      username: "kvatch_reader"
      password: "Readonly123!"
      ssl_mode: require
    desc: "Readonly connection to the projects database hosted on Neon"

  - name: team_sheets
    type: GOOGLESHEET
    connection:
      spreadsheet_id: "1sgZU3bv0CGq0nOD6xTvJAK9YcedFJjE54_TcnjFwRhE"
      read_range: "employees"  
    desc: "Employee info from Google Sheets"

datasets:
  - name: projects
    connector_name: projects_db
    type: SQL
    query: "SELECT id, name, owner_email FROM projects"

  - name: employees
    connector_name: team_sheets
    type: GOOGLESHEET
    config:
      header_row_no: 1
    query: "employees"

This tells Kvatch:

  • Connect to Postgres and read the projects table.
  • Connect to a Google Sheet, read the employees tab, and treat the first row as column headers.
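
The query in the projects dataset is plain SQL that runs against Postgres, so you can also push filtering down to the source instead of pulling the whole table. For example, restricting it to active projects might look like this (a sketch; the status column is hypothetical and not part of the example table):

SELECT id, name, owner_email
FROM projects
WHERE status = 'active'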

⚡ Step 2. Write a Federated Query

Now define a new dataset that joins them:

  - name: project_with_owner
    connector_name: federated
    type: SQL
    query: |
      SELECT p.name AS project_name,
             e.employee_name,
             e.department
      FROM projects p
      JOIN employees e
        ON p.owner_email = e.email
    children:
      - dataset_name: projects
      - dataset_name: employees


Here we're treating Sheets and Postgres as if they were one database.
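
And because the federated query is ordinary SQL over the combined store, you can reshape it like any other query. For example, counting projects per department (same assumed columns as above) could look like:

SELECT e.department,
       COUNT(*) AS project_count
FROM projects p
JOIN employees e
  ON p.owner_email = e.email
GROUP BY e.department
ORDER BY project_count DESC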


⚡ Step 3. Set the Desired Output

Define what results you want to see when you run the query:

output:
    dataset_name: project_with_owner
    verbose: false
    format: table
    limit: 10
        

Here we're saying we want the results as a table, limited to 10 rows.
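
Conceptually, the limit behaves like a LIMIT clause applied to the final federated dataset, roughly equivalent to this query (a sketch; how the dataset is materialized inside combined.db is an assumption):

SELECT project_name, employee_name, department
FROM project_with_owner
LIMIT 10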


⚡ Step 4. Run It

Run the query with:

federate query -p plan.yaml

And you'll see results like:

+---------------------+----------------+-------------+
| project_name        | employee_name  | department  |
+---------------------+----------------+-------------+
| Data Pipeline       | Alice Smith    | Engineering |
| Sales Dashboard     | Bob Johnson    | Analytics   |
| Product Redesign    | Carol Nguyen   | Product     |
| Marketing Campaign  | Frank Martin   | Marketing   |
| Legal Tracker       | Helen Kim      | Legal       |
+---------------------+----------------+-------------+
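
If you want to sanity-check the join, you can also query the Postgres side directly with any client, using the read-only credentials from the plan. The owner emails returned below are what Kvatch matches against the email column in the sheet:

SELECT owner_email, COUNT(*) AS projects_owned
FROM projects
GROUP BY owner_email
ORDER BY owner_email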

🎉 Done in 5 Minutes

That's it — no ETL jobs, no copy-pasting between systems. Just one query that spans Postgres and Google Sheets.

With Kvatch you can:

  • Combine multiple Postgres or MySQL databases
  • Mix APIs, JSON, and CSV files with structured data
  • Keep sources in sync without moving data around

🚀 Try It Yourself

Kvatch CLI is available today. Download it and start querying across all your data sources in minutes.