Changelog

New updates and product improvements

We will be enforcing stricter limitations on the v0 and v1 GET Project Logs endpoints:

```
GET /v1/projects/{ref}/analytics/endpoints/logs.all
GET /v0/projects/{ref}/analytics/endpoints/logs.all
```

The restrictions are as follows:

  • If neither ?iso_timestamp_start= nor ?iso_timestamp_end= is provided, the queried timestamp range will be the last 1 minute.
  • If either ?iso_timestamp_start= or ?iso_timestamp_end= is provided, the queried timestamp range will be a 1 minute window either before or after the provided query parameter.
  • If both ?iso_timestamp_start= and ?iso_timestamp_end= are provided, the maximum allowed queried timestamp range is 24 hours. The size of the permitted window may be subject to change in the future.
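Under these limits, a client that wants the full 24-hour window must supply both parameters explicitly. Here is a minimal sketch of building such a request (the project ref and token are placeholders, and the `date` flags assume GNU date; the Management API base URL is assumed from the endpoint paths above):

```shell
# Build a query for the maximum allowed window (24 hours), supplying both
# iso_timestamp_start and iso_timestamp_end explicitly. Placeholder project ref:
PROJECT_REF="your-project-ref"
ISO_END=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
# GNU date shown; on BSD/macOS use: date -u -v-24H +"%Y-%m-%dT%H:%M:%SZ"
ISO_START=$(date -u -d "24 hours ago" +"%Y-%m-%dT%H:%M:%SZ")
URL="https://api.supabase.com/v1/projects/${PROJECT_REF}/analytics/endpoints/logs.all?iso_timestamp_start=${ISO_START}&iso_timestamp_end=${ISO_END}"
# Then query it with a personal access token:
#   curl "$URL" --header "Authorization: Bearer sbp_TOKEN"
echo "$URL"
```

Requesting a wider range than 24 hours between the two parameters will be rejected once the restriction is in effect.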

This change will go into effect on 2 April at 12pm SGT (4am UTC).

Starting on April 4th, 2025, we will be enhancing experimental routing for eligible Data API requests. This change affects only GET requests to Data APIs; the routing behavior for all other requests remains unchanged:

Current behavior: Round-robin distribution among all databases (primary and all read replicas) in your project

New behavior: Geo-routing that directs requests to the closest available database

This enhancement delivers a better experience for your users by minimizing latency to your project. You can maximize these benefits by strategically placing Read Replicas close to your major customer locations.

For more information on geo-routing and experimental routing, please visit our documentation.

Today we’re announcing the official release of Dedicated Pooler - a PgBouncer instance co-located with your Postgres database for lower latency, better performance, and higher reliability.

This is available today for paid plans on the Supabase platform, while free and paid plans continue to benefit from our Shared Pooler via Supavisor.

Why Dedicated Poolers

We created Dedicated Poolers to give you the ultimate flexibility in choosing the right connection type for your specific use case.

Now you have three options for connecting to your database and you can mix and match depending on your use case:

  1. Direct connections (All Plans): Recommended when you are connecting from long-running servers.
  2. Shared Pooler (All Plans): Recommended when you are connecting from serverless environments (such as Next.js or AWS Lambda) and/or IPv4-only networks.
  3. Dedicated Pooler (Paid Plans): Recommended when you are connecting from serverless environments and you start to scale up. Available on the Pro Plan and above.

How It Works

  • Dedicated Pooler (IPv6 Only) - Only transaction mode is available. For IPv4-only networks, purchase the IPv4 add-on or use the Shared Pooler.
  • Shared Pooler (IPv4 + IPv6) - Both transaction and session modes are available. Use this pooler if your network doesn't support IPv6 or if you need prepared statements.
  • Dedicated Pooler Prepared Statement Support (Coming Soon) - We plan to support prepared statements in transaction mode by upgrading all PgBouncer instances to version 1.21+.

Get started by navigating to your project’s Connect settings or learn more about Dedicated Poolers by reading the announcement or going through the docs.

On April 21, we are restricting certain SQL actions you can perform in your database’s auth, storage, and realtime schemas.

Why are we making these restrictions?

Supabase Auth, Storage, and Realtime services each rely on their respective schemas in order to function properly.

These restrictions prevent unintended side effects, such as third-party tooling or user-defined changes altering these schemas or their objects (for example, migration tables and database functions) in ways that could disrupt or break functionality.

What this means for your project

On April 21, you will no longer be able to perform the following actions on the auth, storage, and realtime schemas:

  • Create tables and database functions
  • Drop existing tables or database functions
  • Create indexes on existing tables
  • Perform destructive actions (i.e. INSERT, UPDATE, DELETE, TRUNCATE) on the following migration tables:
    • auth.schema_migrations
    • storage.migrations
    • realtime.schema_migrations
  • Revoke privileges on tables in these schemas from API roles (e.g. anon)

However, you will still have permissions to perform the following actions:

  • Create foreign keys referencing tables in the auth, storage, and realtime schemas
  • Create RLS policies and database triggers on the following tables:
    • auth.audit_log_entries
    • auth.identities
    • auth.refresh_tokens
    • auth.sessions
    • auth.users
    • storage.buckets
    • storage.migrations
    • storage.objects
    • storage.s3_multipart_uploads
    • storage.s3_multipart_uploads_parts
    • realtime.messages

How to determine if you’re affected?

  • Run the following query to check if you created any tables in the auth, storage, and realtime schemas:

```sql
SET search_path = '';

SELECT oid::regclass AS table_name
FROM pg_class
WHERE
  (relnamespace = 'auth'::regnamespace AND relowner != 'supabase_auth_admin'::regrole)
  OR (relnamespace = 'storage'::regnamespace AND relowner != 'supabase_storage_admin'::regrole)
  OR (
    relnamespace = 'realtime'::regnamespace
    AND relowner NOT IN (
      SELECT oid
      FROM pg_roles
      WHERE rolname IN ('supabase_admin', 'supabase_realtime_admin')
    )
  );
```

  • Run the following query to check if you created any database functions in the auth, storage, and realtime schemas:

```sql
SET search_path = '';

SELECT pg_catalog.format('%s(%s)', oid::regproc, pg_get_function_identity_arguments(oid::regproc)) AS function_name
FROM pg_proc
WHERE
  (pronamespace = 'auth'::regnamespace AND proowner != 'supabase_auth_admin'::regrole)
  OR (pronamespace = 'storage'::regnamespace AND proowner != 'supabase_storage_admin'::regrole)
  OR (
    pronamespace = 'realtime'::regnamespace
    AND proowner NOT IN (
      SELECT oid
      FROM pg_roles
      WHERE rolname IN ('supabase_admin', 'supabase_realtime_admin')
    )
  );
```

What you need to do

If any of the above queries returns a result, you must move those objects to either the public schema or a schema that you've created. Otherwise, they will be deleted.

  • Here’s how you can move a table to another schema:

```sql
CREATE SCHEMA IF NOT EXISTS my_custom_schema;
ALTER TABLE storage.my_custom_table SET SCHEMA my_custom_schema;
```

  • Here’s how you can move a database function to another schema:

```sql
CREATE SCHEMA IF NOT EXISTS my_custom_schema;
ALTER FUNCTION storage.custom_function() SET SCHEMA my_custom_schema;
```

Additionally, if you're using Migrations or Branching, you'll need to patch your migrations to move these objects to your own schemas. For example, if you have a migration 20250101000000_add_custom_table.sql like so:


```sql
-- ...
CREATE TABLE auth.my_custom_table (
  -- id int8 ...
);
-- ...
```

Then you need to edit it locally into:


```sql
-- ...
CREATE SCHEMA IF NOT EXISTS my_custom_schema;
CREATE TABLE my_custom_schema.my_custom_table (
  -- id int8 ...
);
-- ...
```

Then you'll need to repair the migration history on the linked project:


```bash
supabase migration repair --status reverted 20250101000000
supabase migration repair --status applied 20250101000000
```

Here’s everything that happened with Supabase in the last month:

Deploy Edge Functions from the Supabase dashboard

Write your Edge Function in the dashboard using the AI Assistant and deploy it directly.

[Link]

Deploy Edge Functions from the CLI

Write your Edge Function locally and deploy it using the CLI, without having to install Docker.

[Link] [GitHub]

Deploy Edge Functions using the API

AI tools and other products that integrate with Supabase can now deploy Edge Functions using the Supabase API.

[Link] [GitHub]

Connect AI tools and LLMs to Supabase

We’ve published documentation on how to use the Model Context Protocol (MCP) to connect external AI tools to Supabase. Use natural language commands to perform operations in Supabase.

[Link] [Docs]

Third-party Auth is now less expensive

We’ve increased the MAU quota for using third-party authentication providers so it’s easier (and more cost-effective) to start using Supabase with an existing project that uses another auth provider.

[Link]

New billing documentation

Better explanations for how bills are computed, upgrading/downgrading subscriptions, and concepts such as Credits or Spend Caps.

[Docs]

Using Postgres as a Graph Database

pgRouting is a Postgres extension that can be used to add basic graph functionality to Postgres.

[Blog Post]

Quick Product Announcements

  • Invoke the SQL Editor from anywhere in the Supabase Dashboard [Link]
  • Read HubSpot data from within Postgres using the new FDW [Link]
  • Read Notion data from within Postgres using the new FDW [Link]

Made with Supabase

  • Stripe SaaS OSS with Supabase. Launch Full-Stack Apps 100x Easier. [GitHub] [YouTube]
  • Atomic CRM. The Open-Source CRM Toolkit for Personalized Solutions [GitHub] [Website]
  • GymBrah. Run your fitness business without chaos [GitHub] [Website]
  • SQL Noir. Learn SQL by solving crimes [GitHub] [Website]

Community Highlights

  • Global Community Meetups [Sign up here]
  • The easiest way to get started selling SaaS with Polar [Repo]
  • Supabase 2025 Full Free Course [YouTube]
  • Build the Reddit Clone with Supabase [YouTube]
  • How to Use Cursor Agent and Supabase to Maximize Productivity [YouTube]
  • Multilingual transcription Telegram bot using Supabase and ElevenLabs Scribe API [Demo][YouTube]
  • Using Remotion Lambda with Supabase [Docs]

This discussion was created from the release Developer Update - February 2025.

You can now try Deno 2.1 locally with the Supabase CLI. The goal of this local preview is to identify any regressions or missing functionality before we upgrade the hosted version to Deno 2.1.

The hosted version still uses Deno 1.4+, and if you deploy functions written for Deno 2.1, some features may not work.

UPDATE 04/09/25: the hosted version is now using Deno 2.1 - https://github.com/orgs/supabase/discussions/37941

How to try


  • Open supabase/config.toml and add the following:

```toml
[edge_runtime]
deno_version = 2
```

  • All your existing functions should work as before.

  • To scaffold a new function as a Deno 2 project:

```bash
deno init --serve hello-world
```

  • Open supabase/config.toml and add the following:

```toml
[functions.hello-world]
entrypoint = "./functions/hello-world/main.ts"
```

  • Open supabase/functions/hello-world/main.ts and modify line 10 to:

```typescript
if (url.pathname === "/hello-world") {
```
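Once the function is served, it is reachable under the CLI's local functions route. A quick sketch, assuming the default local API port (54321) and the hello-world slug scaffolded above:

```shell
# Local invocation URL, assuming the Supabase CLI's default API port (54321):
SLUG="hello-world"
URL="http://localhost:54321/functions/v1/${SLUG}"
# With `supabase functions serve` running, invoke it with:
#   curl "$URL"
echo "$URL"
```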

Please give it a try and report any bugs or feedback.

We are happy to announce that we are simplifying our pricing further by greatly increasing our Third-Party Auth quotas and aligning them with our regular auth quotas.

|        | Free Plan       | Pro Plan                                | Team Plan                               |
|--------|-----------------|-----------------------------------------|-----------------------------------------|
| Before | 50 included     | 50 included, then $0.00325 per MAU      | 50 included, then $0.00325 per MAU      |
| After  | 50,000 included | 100,000 included, then $0.00325 per MAU | 100,000 included, then $0.00325 per MAU |
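As a worked example of the new Pro Plan quota (the MAU count here is a hypothetical usage figure; the quota and rate are from the table above):

```shell
# Hypothetical: 120,000 third-party MAUs on the Pro Plan after the change.
MAU=120000
INCLUDED=100000          # new included quota on Pro
RATE=0.00325             # USD per MAU beyond the quota
BILLABLE=$(( MAU - INCLUDED ))
COST=$(awk -v n="$BILLABLE" -v r="$RATE" 'BEGIN { printf "%.2f", n * r }')
# 20,000 billable MAUs at $0.00325 each = $65.00
echo "$BILLABLE MAUs over quota -> \$$COST"
```

Before the change, the same usage would have had 119,950 billable MAUs, so the new quota is a substantial reduction.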

With Third-Party Auth, Supabase allows you to use a different Auth provider alongside Supabase Auth and have an integrated experience, without being forced to adopt Supabase Auth straight away. This is especially useful if you'd like to start using Supabase with an existing project that uses a different auth provider.

This change is effective immediately.

Inline Editor (Feature Preview)

We're introducing a new inline editor that you can toggle through Feature Previews! 🙂 The inline editor lets you run SQL wherever you are in the dashboard, without taking you away from what you're doing. As usual, we'd love to hear what you think, so please feel free to share the good and (perhaps more importantly) the bad about this functionality right here in our GitHub discussions! 🙏 More details and information are also available there 😄

PR: https://github.com/supabase/supabase/pull/33541

Link: https://supabase.com/dashboard/project/_

Other bug fixes and improvements

General

  • Fix the Assistant's usability for local / self-hosted environments (PR)

Auth

  • Add context menu support for users to copy email or delete user (PR)
  • Add link to open user logs in the Auth logs (PR)

Billing

  • Support adding additional billing email address (PR)
  • Add Logdrain Egress to Total Egress per day usage chart (PR)

Edge Functions

  • Redirect back to the main Edge Functions page when navigating to a slug that doesn't exist (PR)

Advisors

  • Remove unnecessary 2dp precision for ms values in Query Performance (PR)
  • Fix retrieving indexes in use for a selected SELECT query (PR)

Schema Visualizer

  • Support downloading current view as PNG (PR)

Storage

  • Fix List view footer showing at the top instead of the bottom and covering the first item in the list (PR)

Logs

  • Allow users to click on a property of a selected log and add it to search (PR)
  • Remember last visited route in the Logs section (PR)

Reports

  • Add Supavisor connections to database reports (PR)

We have introduced two new API endpoints that allow you to deploy and update Edge Functions programmatically. This will be handy if you're building a Supabase integration or want to create an internal workflow without relying on the Supabase CLI.

These are the same endpoints we use internally for deploying Edge Functions from the CLI without needing Docker and for writing Edge Functions using the AI Assistant.

Deploy an Edge Function

This endpoint allows you to deploy a function by providing source files and metadata in a multipart/form-data body. You can also provide a function slug as the query parameter. If an existing function for the same slug exists, it will be updated; otherwise, a new function will be created.

You can pass the bundleOnly=1 query parameter to return the metadata for the bundled function without persisting it. This is useful if you want to bulk update multiple functions atomically. Check the next section for the new bulk update endpoint.

API reference: https://supabase.com/docs/reference/api/v1-deploy-a-function

Example:


```bash
curl --request POST \
  --url 'https://api.supabase.com/v1/projects/project-ref/functions/deploy?slug=my-func' \
  --header 'Authorization: Bearer sbp_TOKEN' \
  --header 'content-type: multipart/form-data' \
  --form 'metadata={ "entrypoint_path": "index.ts", "name": "My test" }' \
  --form file=@file
```

After the function is created, you can immediately invoke it:


```bash
curl --request POST 'https://{project-ref}.supabase.co/functions/v1/my-func' \
  --header 'Authorization: Bearer SUPABASE_ANON_KEY' \
  --header 'Content-Type: application/json' \
  --data '{ "name":"Functions" }'
```

Bulk update Edge Functions

This endpoint allows you to update multiple Edge Functions atomically.

When deploying multiple Edge Functions, we recommend calling the deploy endpoint with the bundleOnly=1 query parameter, collecting the responses, and then calling the bulk update endpoint to update them atomically.

API reference: https://supabase.com/docs/reference/api/v1-bulk-update-functions
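The recommended flow can be sketched as follows. The endpoint shapes are assumptions based on the API reference links above, and the project ref and slug are placeholders; this only builds the URLs without calling them:

```shell
# Sketch of the atomic bulk-update flow. Placeholder values throughout.
REF="project-ref"
BASE="https://api.supabase.com/v1/projects/${REF}"
# 1) Bundle each function without persisting it, collecting the returned metadata:
DEPLOY_URL="${BASE}/functions/deploy?slug=my-func&bundleOnly=1"
#      curl --request POST "$DEPLOY_URL" ... > my-func.json
# 2) Send the collected metadata in one bulk-update request so every function
#    is updated atomically (see the API reference for the exact body shape):
BULK_URL="${BASE}/functions"
#      curl --request PUT "$BULK_URL" --data @all-functions.json ...
echo "$DEPLOY_URL"
echo "$BULK_URL"
```

Because the bundle step persists nothing, a failure partway through leaves all deployed functions untouched.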

Write and run queries from anywhere in the dashboard

https://github.com/user-attachments/assets/5951fbb2-3db7-48e5-b4fa-7c83fbad3d2d

Today we are introducing a feature preview that gives you access to a new "inline editor". This is a SQL editor that is accessible from wherever you are in the dashboard. You can write or generate queries, run them, view results, and save them as snippets if needed. It sits alongside other pages in the dashboard, which makes it easy to reference tables, policies, functions, triggers, etc. while writing your queries.

As part of this preview, we are also changing how you create and edit policies, triggers, and functions to make use of this new inline editor. We often receive feedback on the existing UI-based approach to creating policies, and our hope is that, with the continued advancement of LLMs, we can augment a plain SQL editor with AI assistance rather than needing a UI. The inline editor has access to an inline Assistant (press cmd + K while in the editor) which will help you generate and modify queries.

To enable the feature preview, click your profile avatar, select Feature Previews, and enable the inline editor.

What we'd like to know from you

  • When you use the inline editor, please tell us what you used it for, how the experience was, and what could be improved
  • For each use case, try the inline Assistant (via cmd + K) and provide feedback on the results. We want to continue iterating on our prompts so that accuracy is as good as it can be
  • How comfortable are you using an open editor with templates and AI guidance vs the existing UI-based approach for policies and functions?
  • Any other ideas or suggestions

Build in a weekend, scale to millions