Google Cloud SQL Tutorial - Database Deployment and Python

By NeuralNine


Cloud SQL & Python Database Setup on Google Cloud

Key Concepts:

  • Cloud SQL: Google’s fully-managed database service supporting MySQL, PostgreSQL, and SQL Server.
  • PostgreSQL: An open-source relational database system used in this tutorial.
  • pg_dump: A utility for backing up PostgreSQL databases into SQL script files.
  • psycopg2: The most popular PostgreSQL adapter for Python.
  • Public IP Address: The IP address used to connect to the Cloud SQL instance from external networks.
  • Authorized Networks: A security feature in Cloud SQL that allows specific IP addresses or ranges to connect to the instance.
  • SSL Mode (prefer): A security setting for the database connection that attempts to use SSL if available.
  • uv: A fast Python package and project manager, used here to install psycopg2.

1. Setting up a Cloud SQL Instance for PostgreSQL

The tutorial begins with creating a new project in the Google Cloud Console. This involves:

  • Logging into a Google account.
  • Clicking "New Project" and naming it (e.g., "cloud SQL tutorial").
  • Attaching a billing account to the project.
  • Selecting the newly created project.
  • Navigating to the Cloud SQL service.
  • Creating a new instance, utilizing the 30-day free trial.
  • Choosing PostgreSQL as the database engine.
  • Assigning a name to the instance (default is acceptable).
  • Setting a password for the postgres user (e.g., "cloud SQL tutorial").
  • Selecting a region (e.g., "Europe West 4" based on the user’s location).

The instance creation process takes time, during which the tutorial explores data migration options.

2. Data Migration from a Local PostgreSQL Database

The presenter demonstrates importing data from an existing local PostgreSQL database into the newly created Cloud SQL instance. This process involves:

  • Local Database Setup: A Docker Compose file is used to run a basic PostgreSQL database locally with a user (postgres), a password (secretpass), and a database named some_db. The database contains sample data about people owning items.
  • Database Export (pg_dump): The pg_dump utility is used to export the local database into a SQL file (dump.sql). The command used is:
    docker compose exec -T db pg_dump -U postgres -F plain --no-owner --no-acl -d some_db > dump.sql
    
    • -T: Disables the pseudo-TTY so the shell redirect captures clean output.
    • -U postgres: Connects as the postgres user.
    • -F plain: Outputs the dump as plain SQL text.
    • --no-owner (-O): Excludes ownership commands from the dump.
    • --no-acl (-x): Excludes access-control (GRANT/REVOKE) statements from the dump.
    • -d some_db: Specifies the database to dump.
  • Google Cloud Storage Upload: The dump.sql file is uploaded to a Google Cloud Storage bucket.
    • A new bucket is created (e.g., "cloud SQL bucket test") in the EU multi-region.
    • The dump.sql file is uploaded to the bucket.
  • Data Import: The dump.sql file is imported into the Cloud SQL instance via the instance’s Import option, selecting the uploaded file, the SQL format, and the target database.
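The local setup described above can be reproduced with a minimal docker-compose.yml. This is a sketch matching the credentials mentioned in the tutorial; the presenter’s actual file is not shown, and the image tag is an assumption:

```yaml
services:
  db:
    image: postgres:16          # any recent PostgreSQL image works
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: secretpass
      POSTGRES_DB: some_db
    ports:
      - "5432:5432"             # expose the default PostgreSQL port locally
```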

3. Exploring the Cloud SQL Studio

Once the data is imported, the tutorial demonstrates using the Cloud SQL Studio:

  • Accessing the studio through the Cloud SQL instance overview.
  • Authenticating with the postgres user and the password set during instance creation ("cloud SQL tutorial").
  • Executing SQL queries directly in the browser.
  • An example query is shown:
    SELECT people.name, items.name FROM people INNER JOIN items ON people.id = items.owner_id;
    
    This query retrieves the names of people and the items they own.
  • Mention of the Gemini code assistant for AI-powered query assistance (not demonstrated).
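For readers without a Cloud SQL instance at hand, the same join can be tried locally with Python’s built-in sqlite3 module. The schema follows the tutorial’s people/items tables; the sample rows are invented for illustration:

```python
import sqlite3

# In-memory stand-in for the tutorial's schema; sample rows are made up.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, "
    "owner_id INTEGER REFERENCES people(id))"
)
cur.executemany("INSERT INTO people VALUES (?, ?)", [(1, "Alice"), (2, "Bob")])
cur.executemany(
    "INSERT INTO items VALUES (?, ?, ?)",
    [(1, "laptop", 1), (2, "phone", 2)],
)

# The same join shown in Cloud SQL Studio (ORDER BY added for stable output).
rows = cur.execute(
    "SELECT people.name, items.name FROM people "
    "INNER JOIN items ON people.id = items.owner_id "
    "ORDER BY people.name"
).fetchall()
print(rows)  # [('Alice', 'laptop'), ('Bob', 'phone')]
conn.close()
```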

4. Connecting to Cloud SQL from Python

The tutorial then focuses on connecting to the Cloud SQL instance from a Python application:

  • Package Installation: The psycopg2 package is installed using uv (a package manager):
    uv init
    uv add psycopg2-binary
    
    Alternatively, pip install psycopg2-binary can be used.
  • Connection Establishment: A Python script is created to connect to the database using psycopg2.connect(). The connection parameters include:
    • host: The public IP address of the Cloud SQL instance (obtained from the instance overview).
    • user: postgres.
    • password: "cloud SQL tutorial".
    • dbname: postgres.
    • port: 5432 (default PostgreSQL port).
    • sslmode: prefer (attempts to use SSL if available).
  • Query Execution: A simple query is executed to retrieve data from the people table:
    import psycopg2
    
    # Connection details come from the Cloud SQL instance overview page.
    conn = psycopg2.connect(
        host="<your_public_ip>",   # public IP of the Cloud SQL instance
        user="postgres",
        password="cloud SQL tutorial",
        dbname="postgres",
        port=5432,                 # default PostgreSQL port
        sslmode="prefer",          # use SSL if the server supports it
    )
    cur = conn.cursor()
    cur.execute("SELECT * FROM people")
    print(cur.fetchall())
    cur.close()
    conn.close()
    
  • Network Authorization: The initial connection attempt fails due to network restrictions. The presenter demonstrates how to authorize the network by:
    • Navigating to the "Connections" section of the Cloud SQL instance.
    • Adding a network with the user’s public IP address (obtained from a site such as myip.is) and a /32 suffix, which authorizes only that single address.
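As an aside, psycopg2.connect() also accepts a single libpq-style key=value connection string instead of keyword arguments. A small helper like the one below can keep the parameters in one place; the helper name and example values are mine, not from the video. Note that values containing spaces, such as the tutorial’s password, must be single-quoted in the string:

```python
def build_dsn(host, password, user="postgres", dbname="postgres",
              port=5432, sslmode="prefer"):
    """Build a libpq key=value connection string for psycopg2.connect()."""
    # Single-quote the password so values with spaces stay intact.
    return (f"host={host} port={port} dbname={dbname} "
            f"user={user} password='{password}' sslmode={sslmode}")

dsn = build_dsn("203.0.113.5", "cloud SQL tutorial")
print(dsn)
# conn = psycopg2.connect(dsn)  # same effect as the keyword-argument call
```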

5. Additional Features & Conclusion

The tutorial briefly mentions advanced features available in the paid tier:

  • Replicas: For high availability and disaster recovery.
  • Backups: For data protection and recovery.

The presenter encourages viewers to like the video, subscribe to the channel, and explore one-on-one tutoring or services offered on their website.

Data & Statistics:

  • 30-day free trial of Google Cloud SQL.
  • Default PostgreSQL port: 5432.

Logical Connections:

The tutorial follows a logical progression: setting up the database instance, migrating data, exploring the database through the studio, and finally, connecting to it programmatically from Python. Each step builds upon the previous one, demonstrating a complete workflow for using Cloud SQL with PostgreSQL.

