Daily Pulse

CREATED BY

NL

1 Template

11 Views

LAST UPDATED

January 6, 2026

INTEGRATIONS USED

Google BigQuery, Google Docs, Google Drive, Slack

DESCRIPTION

Overview

This workflow automates daily business reporting by querying your data warehouse, compiling meeting transcripts, and delivering AI-formatted summaries directly to Slack. It leverages Google BigQuery for data extraction, Google Docs for transcript aggregation, and Slack for real-time delivery.

How It Works

  • Queries your data warehouse via Google BigQuery to pull key business metrics.

  • Retrieves and reads meeting transcripts, then uses AI to format the data and summarize key insights from team conversations.

  • Sends both the formatted business data and meeting summaries to your designated Slack channel.

Note: The folder names used in this template are defaults; feel free to rename them to fit your organization.

Use Cases

  • Founders and operators who want a daily snapshot of business performance and team activity.

  • Remote teams staying aligned on key metrics and conversations without manual reporting.

  • Revenue and ops leaders consolidating data from multiple sources into one morning briefing.

Setup Requirements

  • Google BigQuery credentials and your Google Cloud Project ID to query your dataset.

  • OAuth connections for Google Docs and Google Drive to access and read document contents.

  • Slack credentials with permission to post messages in your selected channel.

HOW DO YOU SET THIS UP?

1.

Connect Your Data Source

This example uses BigQuery, but the same setup applies to similar data sources such as Snowflake or Databricks. From the template, select the Query Business Data tab. Input your Google Cloud Project ID and the SQL query you want to run. Use AI to help build your query if needed. Next, define the column names you'll be using. There's no auto-detect or bulk import functionality, so you'll need to add each column name individually.
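
If it helps to see the shape of this step outside the workflow UI, here's a rough Python sketch of the same query using the google-cloud-bigquery client. The project ID, table, and column names are placeholders; substitute your own.

```python
from google.cloud import bigquery

# Placeholder project, table, and column names - swap in your own.
client = bigquery.Client(project="your-project-id")

sql = """
    SELECT signup_count, mrr, churned_accounts
    FROM `your-project-id.analytics.daily_metrics`
    WHERE report_date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
"""

for row in client.query(sql).result():
    print(row["signup_count"], row["mrr"], row["churned_accounts"])
```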

2.

Drag your columns into the node

Drag the columns you defined in Step 1 into the node to pass them to the model. Use the included prompt or replace it with your own, then select your AI model.
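
Conceptually, this step just interpolates the column values into the prompt text before it's sent to the model. A rough sketch of what that looks like (the metric names and values are made up):

```python
# Hypothetical column values pulled in Step 1.
metrics = {"signup_count": 42, "mrr": 18250, "churned_accounts": 3}

prompt = (
    "Format the following daily business metrics into a short briefing, "
    "one bullet per metric, and flag anything that looks unusual:\n\n"
    + "\n".join(f"- {name}: {value}" for name, value in metrics.items())
)
# `prompt` is what the node sends to whichever AI model you select.
print(prompt)
```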

3.

Select 'Meet Recordings' Folder

By default, Google adds meeting transcripts and/or summaries to a Google Drive folder titled 'Meet Recordings'. This should be the folder where all of your transcripts or meeting summaries are housed.
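
For reference, reading that folder outside the workflow would look roughly like this with the Google Drive v3 API (the folder ID is a placeholder, and OAuth credentials are assumed to be configured already):

```python
from googleapiclient.discovery import build

drive = build("drive", "v3")  # assumes your OAuth credentials are already configured

# Folder ID of 'Meet Recordings', copied from the folder's URL (placeholder value).
meet_recordings_id = "MEET_RECORDINGS_FOLDER_ID"

transcripts = drive.files().list(
    q=f"'{meet_recordings_id}' in parents and trashed = false",
    fields="files(id, name, mimeType)",
).execute()

for f in transcripts.get("files", []):
    print(f["name"], f["mimeType"])
```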

4.

Copy files to Call Transcripts

Create a Google Drive folder called Call Transcripts and set it as the Destination Folder. This folder serves as a processing stage. Another node later in the workflow archives these files, ensuring they aren't included in future analysis. * These are the default folder names used in this template; feel free to rename them to fit your organization.
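
Under the hood, this step amounts to copying each file from 'Meet Recordings' into 'Call Transcripts'. A minimal Drive API sketch, with placeholder folder IDs:

```python
from googleapiclient.discovery import build

drive = build("drive", "v3")  # assumes your OAuth credentials are already configured

source_id = "MEET_RECORDINGS_FOLDER_ID"        # placeholder
destination_id = "CALL_TRANSCRIPTS_FOLDER_ID"  # placeholder

files = drive.files().list(
    q=f"'{source_id}' in parents and trashed = false",
    fields="files(id, name)",
).execute().get("files", [])

for f in files:
    # Copy each transcript into the processing folder; a later node archives
    # the processed copies so they aren't re-analyzed tomorrow.
    drive.files().copy(fileId=f["id"], body={"parents": [destination_id]}).execute()
```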

5.

Insert Prompt and Choose Model

Use the included prompt or add your own. Select your preferred model from the dropdown.
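
If you'd rather write your own prompt, something along these lines is a reasonable starting point (adjust the tone and length to taste):

```
You are preparing a morning briefing for the leadership team. Summarize each
meeting transcript below: list the key decisions, open questions, and action
items with owners. Keep each summary under 150 words.
```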

6.

Slack Message Sender

Authenticate with your Slack credentials and select your destination channel.
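
For reference, posting the finished briefing with Slack's Web API looks like this (the token and channel name are placeholders; the bot needs the chat:write scope):

```python
from slack_sdk import WebClient

briefing_text = "Daily Pulse: 42 signups, $18,250 MRR, 3 meetings summarized."  # example payload

client = WebClient(token="xoxb-your-bot-token")  # placeholder bot token
client.chat_postMessage(channel="#daily-pulse", text=briefing_text)  # placeholder channel
```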

7.

Archive Call Transcripts

Enter the URL of your "Call Transcripts" folder as the Source. Next, create a separate Google Drive folder for storing your archived transcripts, and set it as the Destination. This ensures processed files are moved out of the way so the analysis runs only on new files each day. Connect this node to the Archive Call Transcripts node.
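
Moving files between Drive folders is an update of the file's parents rather than a copy. A rough sketch of what this archive step does, with placeholder folder IDs:

```python
from googleapiclient.discovery import build

drive = build("drive", "v3")  # assumes your OAuth credentials are already configured

call_transcripts_id = "CALL_TRANSCRIPTS_FOLDER_ID"  # placeholder
archive_id = "CALL_TRANSCRIPTS_ARCHIVE_FOLDER_ID"   # placeholder

files = drive.files().list(
    q=f"'{call_transcripts_id}' in parents and trashed = false",
    fields="files(id, name)",
).execute().get("files", [])

for f in files:
    # Move (not copy) each processed transcript into the archive folder so
    # tomorrow's run only sees new files.
    drive.files().update(
        fileId=f["id"],
        addParents=archive_id,
        removeParents=call_transcripts_id,
    ).execute()
```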

8.

Archive Meet Recordings

Enter your "Meet Recordings" folder URL as the Source, and your "Meet Recordings Archive" folder URL as the Destination. As in the previous step, this ensures that only new files are processed each day.

9.

Add a Time Trigger

From the Nodes menu, select Triggers > Create a Time Trigger. We recommend scheduling this for early morning (e.g., 5am local time), before your team starts working, so all calls and transcripts from the previous day are captured and nothing is missed mid-recording.
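
If you were running the equivalent pipeline outside the template, the time trigger maps to an ordinary daily schedule, for example with the third-party schedule library (the function name here is a placeholder for kicking off the steps above):

```python
import time

import schedule  # third-party: pip install schedule

def run_daily_pulse():
    # Placeholder: query BigQuery, summarize transcripts, post to Slack.
    ...

schedule.every().day.at("05:00").do(run_daily_pulse)  # 5am local time, before the workday

while True:
    schedule.run_pending()
    time.sleep(60)
```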