Connect to Google Cloud SQL using Jupyter Notebooks on Google AI Platform

Alexander Ekdahl
2 min read · Mar 8, 2021


This is a simple tutorial on connecting a Jupyter Notebook running on Google Cloud to Google Cloud SQL. It is actually very easy to do, but the documentation on how to do it can be hard to find.

Good news first: there is no need for any SQL proxy to connect your notebook. Other applications, such as Kubernetes and App Engine, require proxies, but notebooks on the AI Platform work fine with an authorised IP address.

Essentials

You need the following Cloud SQL credentials to connect your notebook.

  1. Database username
  2. Database password
  3. Database host (IP address)
  4. Database name
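A common pattern is to keep these credentials out of the notebook itself and read them from environment variables. A minimal sketch — the variable names and placeholder values here are illustrative, not anything Cloud SQL requires:

```python
import os

# Illustrative defaults only; in practice, set these in your environment
# rather than hard-coding them in the notebook.
os.environ.setdefault("DB_USER", "notebook_user")
os.environ.setdefault("DB_PASS", "change-me")
os.environ.setdefault("DB_HOST", "203.0.113.10")  # Cloud SQL public IP
os.environ.setdefault("DB_DB", "my_database")

# Read the credentials back for use in the connection code below.
DB_USER = os.environ["DB_USER"]
DB_PASS = os.environ["DB_PASS"]
DB_HOST = os.environ["DB_HOST"]
DB_DB = os.environ["DB_DB"]
```

This keeps secrets out of version control if you later check the notebook into a repository.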

Authorise the notebook server’s IP address

If you’re running your notebook on the AI Platform, it is actually running on a Compute Engine instance in the same project. So head over to your Compute Engine instances and fetch the external IP address of the instance running the notebook. Then head back to your Cloud SQL instance and add that IP address as an authorised network. If you want to avoid repeating this on every restart of the instance, assign it a static external IP address.
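The same steps can also be done from the command line with the gcloud CLI. A sketch, assuming a notebook VM and Cloud SQL instance with hypothetical names — replace the names, zone, and IP with your own:

```shell
# Fetch the notebook VM's external IP address
# (instance name and zone are placeholders):
gcloud compute instances describe my-notebook-vm \
    --zone=europe-west1-b \
    --format="get(networkInterfaces[0].accessConfigs[0].natIP)"

# Authorise that IP address on the Cloud SQL instance.
# Note: --authorized-networks replaces the whole existing list,
# so include any previously authorised networks as well.
gcloud sql instances patch my-sql-instance \
    --authorized-networks=203.0.113.10/32
```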

Connect

You’re all set! Connect using the package of your choice; below is an example with SQLAlchemy and pymysql.

import sqlalchemy
import pymysql

db = sqlalchemy.create_engine(
    sqlalchemy.engine.url.URL(
        drivername="mysql+pymysql",
        username=DB_USER,
        password=DB_PASS,
        host=DB_HOST,
        database=DB_DB,
    )
)

Then run SQL as usual:

with db.connect() as conn:
    qry = "SELECT * FROM some_table"
    results = conn.execute(qry).fetchall()
    for row in results:
        ...

I hope this is helpful for someone trying to quickly sort out how to connect the notebook to GCP Cloud SQL.


Written by Alexander Ekdahl

AI and E-commerce enthusiast. Founder of zubi.ai. We aim to help the world make use of its data.
