
I was following the Connecting to Cloud SQL with Cloud Functions tutorial but my Cloud Function won’t connect to the PostgreSQL database.

Here’s my function code:

import sqlalchemy

connection_name = "redacted-1234a:asia-northeast3:myinstance2"

query_string = {"unix_sock": "/cloudsql/{}/.s.PGSQL.5432".format(connection_name)}

def insert(request):
    print(f"Started function - query_string: {query_string}")
    request_json = request.get_json()
    stmt = sqlalchemy.text('insert into {} {} values {}'.format("entries", "(guestName, content)", "('cloud hello', 'Got here!')"))
    
    db = sqlalchemy.create_engine(
      sqlalchemy.engine.url.URL(
        drivername="postgres+pg8000",
        username="postgres",
        password="redacted",
        database="guestbook",
        query=query_string,
      ),
      pool_size=5,
      max_overflow=2,
      pool_timeout=30,
      pool_recycle=1800
    )
    print("Created engine")
    try:
        with db.connect() as conn:
            print("Connected to engine")
            conn.execute(stmt)
    except Exception as e:
        return 'Error: {}'.format(str(e))
    return 'ok'

When I run it from the "Testing" tab, I get:

2023-12-20 10:58:06.728 JST - Started function - query_string: {'unix_sock': '/cloudsql/redacted-1234a:asia-northeast3:myinstance2/.s.PGSQL.5432'}
2023-12-20 10:58:09.054 JST - Created engine
Error: (pg8000.core.InterfaceError) ('communication error', FileNotFoundError(2, 'No such file or directory'))
(Background on this error at: http://sqlalche.me/e/rvf5)

Here’s what I ran in Cloud Shell when setting up the database:

$ gcloud sql connect myinstance2 --user=postgres
Allowlisting your IP for incoming connection for 5 minutes...done.                                                                                                                                      
Connecting to database with SQL user [postgres].
Password: # typed in redacted
psql (16.1 (Debian 16.1-1.pgdg110+1), server 15.4)
SSL connection (protocol: TLSv1.3, cipher: TLS_AES_256_GCM_SHA384, compression: off)
Type "help" for help.
postgres=> CREATE DATABASE guestbook;
CREATE DATABASE
postgres=> \connect guestbook
Password: # typed in redacted
psql (16.1 (Debian 16.1-1.pgdg110+1), server 15.4)
SSL connection (protocol: TLSv1.3, cipher: TLS_AES_256_GCM_SHA384, compression: off)
You are now connected to database "guestbook" as user "postgres".
guestbook=> CREATE TABLE entries (guestName VARCHAR(255), content VARCHAR(255), entryID SERIAL PRIMARY KEY);
CREATE TABLE

How can I get this to work? I feel like I’ve checked everything:

  • The function is run with a service account that has the Cloud SQL Client role
  • redacted is the password for the postgres user (I am able to connect with it via Cloud Shell)
  • The PostgreSQL instance has both Public IP and Private IP with "private path" enabled
  • Under SQL > Connections > Security, Google Cloud services authorization and App Engine authorization are "Enabled", and "Allow only SSL connections" and "Require trusted client certificates" are set to "Disabled"

Another interesting anomaly with the Cloud Function: I verified three times during creation that "Allow unauthenticated invocations" was selected, but after creating the function it reverted to "Require authentication". I don’t think this is super relevant, though, since I’m still able to run the function; I just can’t connect to the database.


Answers


  1. Chosen as BEST ANSWER

    Strangely, it wasn't part of the codelab I followed, but you also need to follow these steps: https://cloud.google.com/sql/docs/mysql/connect-functions#configure

    You'll want to carefully follow that page (in case it changes), but the general steps are:

    • Go to the Cloud Run page
    • Open your function and click Edit
    • In the Container(s) tab, find and add your connection name to Cloud SQL connections

    After deploying this change, the function will be able to connect to the Cloud SQL database.
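
    The same configuration can also be applied from the CLI against the function's underlying Cloud Run service (this assumes a 2nd-gen function; the service name below is a placeholder, and the connection name is the one from the question):

    ```shell
    # Attach the Cloud SQL instance to the function's underlying Cloud Run
    # service; this mounts the /cloudsql unix socket into the container.
    gcloud run services update YOUR_FUNCTION_NAME \
      --region=asia-northeast3 \
      --add-cloudsql-instances=redacted-1234a:asia-northeast3:myinstance2
    ```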


  2. I’d recommend using the Cloud SQL Python Connector library. It will manage connecting to your instance with a secure TLS/SSL connection and give you the ability to use automatic IAM authentication along with other benefits. It will also allow you to choose which IP type to connect to (Public vs Private) as the above unix socket does not make this clear.

    It also allows you to connect without needing to do the additional step you mentioned:

    In the Container(s) tab, find and add your connection name to Cloud SQL connections

    An example Cloud Function would look as follows:

    from google.cloud.sql.connector import Connector, IPTypes
    import sqlalchemy
    
    connector = None
    db = None
    
    def init_pool(connector):
      def getconn():
        connection = connector.connect(
          "<YOUR_PROJECT>:<YOUR_REGION>:<YOUR_INSTANCE_NAME>",
          "pg8000",
          user="<YOUR_USER>",
          password="<YOUR_PASSWORD>",
          db="<YOUR_DB>",
          ip_type=IPTypes.PUBLIC,  # IPTypes.PRIVATE for Private IP
        )
        return connection
    
      # create connection pool
      engine = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)
      return engine
    
    def hello_world(request):
      global connector, db
      if not db:
        connector = Connector()
        db = init_pool(connector)
      # check out a connection from the pool
      with db.connect() as conn:
        result = conn.execute(sqlalchemy.text("SELECT NOW()")).fetchone()
      print("Successfully connected to Cloud SQL instance!")
      return str(result[0])
    

    We lazily initialize the connection pool so it can be reused across requests (the code in the question creates a new engine, and therefore a new pool, on every invocation).
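
    The lazy-init idea can be isolated into a tiny runnable sketch (a plain `object()` stands in for the engine here, and the names are illustrative, not from the Cloud SQL docs):

    ```python
    # Sketch of the lazy-init pattern: build the "pool" on first use,
    # then hand back the same instance on every later call.
    _pool = None

    def get_pool():
        """Create the pool on the first call, then reuse the same instance."""
        global _pool
        if _pool is None:
            _pool = object()  # stand-in for sqlalchemy.create_engine(...)
        return _pool
    ```

    Because `_pool` lives at module scope, it survives between invocations for as long as the function instance stays warm, which is exactly what makes the pool reusable across requests.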
