SQLAlchemy makes it easy to read and write data across databases. I am using Google Colab and free cloud databases to store and manipulate the data. In this article, I will read data from a MySQL database and save the result to a PostgreSQL database using the Pandas DataFrame to_sql method.
Steps
- read data from MySQL database
- save data to PostgreSQL database
Prerequisites
The following Python packages are needed (the Google Colab runtime comes with SQLAlchemy and Pandas preinstalled):
- pymysql
- psycopg2
- pandas
- sqlalchemy – create_engine, text
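Building the connection URL by hand can break when credentials contain special characters. As a sketch (the userid, password, and host values below are placeholders), SQLAlchemy's URL.create assembles the string and percent-encodes the parts for you:

```python
from sqlalchemy.engine import URL

# build the MySQL connection URL; all values below are placeholders
url = URL.create(
    drivername="mysql+pymysql",
    username="userid",
    password="p@ss:word",  # special characters are escaped when rendered
    host="db4free.net",
    database="dbname",
)

# the URL object can be passed straight to create_engine
print(str(url))
```

The URL object can be handed to create_engine in place of a raw string.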
Reading data
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("mysql+pymysql://userid:password@db4free.net/dbname")

# run the query and load the result into a DataFrame;
# the with block closes the connection when done
query = text('SELECT * FROM table')
with engine.connect() as conn:
    df = pd.read_sql_query(query, conn)

# release all pooled connections
engine.dispose()
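When the query needs user-supplied values, bind parameters with text() keep it safe from SQL injection. A minimal sketch of the same pattern, using an in-memory SQLite database so it runs anywhere (the users table and min_id parameter are illustrative, not from the article's database):

```python
import pandas as pd
from sqlalchemy import create_engine, text

# in-memory SQLite stands in for the MySQL database here
engine = create_engine("sqlite:///:memory:")

# seed a small table to query against
with engine.begin() as conn:
    conn.execute(text("CREATE TABLE users (id INTEGER, name TEXT)"))
    conn.execute(
        text("INSERT INTO users (id, name) VALUES (:id, :name)"),
        [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}],
    )

# :min_id is a bound parameter; the driver handles quoting
query = text("SELECT * FROM users WHERE id >= :min_id")
with engine.connect() as conn:
    df = pd.read_sql_query(query, conn, params={"min_id": 2})

engine.dispose()
```

The params dict is substituted by the driver, so values never need to be spliced into the SQL string.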
Writing data
from sqlalchemy import create_engine
CONNSTR = 'postgresql://userid:password@hostname/dbname?options=endpoint%3Dproject_id'
engine = create_engine(CONNSTR)
# save data using pandas to_sql
with engine.begin() as connection:
    df.to_sql(
        name='table',
        con=connection,
        if_exists='replace',
        index=False,
    )
# release all pooled connections
engine.dispose()
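To confirm the write succeeded, the table can be read back with pandas. A sketch of that round trip, with SQLite standing in for the PostgreSQL target so it runs anywhere (the result table name and sample DataFrame are illustrative):

```python
import pandas as pd
from sqlalchemy import create_engine

# SQLite stands in for the PostgreSQL instance in this sketch
engine = create_engine("sqlite:///:memory:")

df = pd.DataFrame({"id": [1, 2], "name": ["alice", "bob"]})

# same write pattern as above: engine.begin() commits on success
with engine.begin() as connection:
    df.to_sql(name="result", con=connection, if_exists="replace", index=False)

# read the table back to verify the write landed
check = pd.read_sql_table("result", engine)
engine.dispose()
```

Note that if_exists='replace' drops and recreates the table on every run; use 'append' to keep existing rows.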