Shantanu's Blog

Corporate Consultant

July 12, 2015

 

python and sqlite

Here is how to connect to an SQLite database and insert data using Python.

import sqlite3

# create a database
# add a table with a few columns. one should be quantity
with sqlite3.connect("new.db") as connection:
    c = connection.cursor()
    c.execute("CREATE TABLE pizza(topping_1 TEXT, topping_2 TEXT,     quantity INT)")
    # add a row to the table
    c.execute("INSERT INTO pizza VALUES('pepperoni', 'mushrooms', 5)")



June 28, 2015

 

min and max functions in python

I have a list with the name, geo-location, and population of 3 cities.

>>> cities
[['DENVER', [-104.98, 39.74], 634265], ['BOULDER', [-105.27, 40.02], 98889], ['DURANGO', [-107.88, 37.28], 17069]]

If I use the max function, it compares the inner lists element by element, so it picks the city whose name comes last alphabetically (BO, DE, DU):
>>> max(cities)
['DURANGO', [-107.88, 37.28], 17069]

In order to find the biggest city by population, I need the max by the value at index 2 (the population), using a key function.
>>> max(cities, key=lambda city:city[2])
['DENVER', [-104.98, 39.74], 634265]

When you apply this logic to the geo-location, it gets more interesting.
The city with the minimum longitude is the westernmost city. Since longitude is the first element of the geo-location pair, using the pair itself as the key works:

>>> min(cities, key=lambda city:city[1])
['DURANGO', [-107.88, 37.28], 17069]
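The same idea works for latitude: to find the northernmost city, the key can return the second element of the geo-location pair. A small sketch with the same cities list:

>>> max(cities, key=lambda city: city[1][1])
['BOULDER', [-105.27, 40.02], 98889]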

The simple built-ins like min and max can be so powerful at times!



June 11, 2015

 

Quickly check the contents of a compressed csv file

If I quickly need to check the first 10 records of a 14 GB compressed csv file saved on S3, here are just 3 Python commands.

import iopro
adapter = iopro.s3_text_adapter('access_key', 'secret_key' , 'bucket_name' , 'folder/master_bkup.csv.gz', compression='gzip')
array = adapter[0:10]

This returns the first 10 records almost instantly. But if you ask for the last 10 records (adapter[-10:]), it will take a long time because the entire 14 GB file has to be downloaded.

You can also download the compressed file and open it in Python.

import iopro
adapter = iopro.text_adapter('dlr_master_bkup.csv.gz', compression='gzip')
array = adapter[0:10]
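For the local copy, the standard library can give a similar quick peek without iopro; a minimal sketch assuming Python 3 and the same local file name:

import gzip
from itertools import islice

# read just the first 10 lines without decompressing the whole file
with gzip.open('dlr_master_bkup.csv.gz', 'rt') as f:
    for line in islice(f, 10):
        print(line.rstrip())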

Check out the iopro module available from Anaconda:
http://docs.continuum.io/iopro/textadapter_advanced.html

May 11, 2015

 

Copy data from Redshift to MySQL

Here is Python code that connects to Redshift and pulls the data into a dataframe, then copies the data to a MySQL table. If the table exists, it will be replaced.

import easyboto

x = easyboto.myboto('xxx', 'yyy')
mydf = x.runQuery("select * from pg_table_def where schemaname = 'public'")
mydf.columns = ['schemaname', 'tablename', 'column_nm', 'type', 'encoding', 'distkey', 'sortkey', 'notnull']

import sqlalchemy
engine = sqlalchemy.create_engine('mysql://dba:dba@127.0.0.1/test')
mydf.to_sql('testdata', engine, if_exists='replace')
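To check that the copy landed in MySQL, the table can be read back with pandas using the same engine; a quick sketch:

import pandas as pd

# count the rows that were copied into the new MySQL table
check = pd.read_sql("select count(*) as cnt from testdata", engine)
print(check)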



May 09, 2015

 

convert any Excel file to MySQL using 5 lines of code

2 lines to open and read the Excel file.
2 lines to connect to the MySQL server.
1 line to dump the data to MySQL.
Life would not be this easy without pandas and Python! :)

import pandas as pd
df = pd.read_excel("5ch.xls", parse_dates=True, header=3)

import pymysql
conn = pymysql.connect(host='localhost', port=3306, user='dba', passwd='dba', db='test')

df.to_sql(con=conn, name='temp_aaa', if_exists='replace', flavor='mysql')
_____

You can of course use pandas functions like melt, loc, concat, or merge (and several others) before pushing the data to MySQL, for example:

pd.melt(df, ["number", "reg_no", "st_name"], var_name=["c_six_to_ten"]).sort("reg_no")

df.loc[df['state'] == df['state'].shift(), 'state'] = ''

pd.concat([df,df1], ignore_index=True).drop_duplicates()
 
pd.merge(df, df1, on='InvDate')
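As a quick illustration of melt, here is a small sketch with made-up column names (a marks table reshaped into long format); the data and column names are only for illustration:

import pandas as pd

marks = pd.DataFrame({'reg_no': [101, 102],
                      'maths': [80, 65],
                      'science': [72, 90]})

# turn the subject columns into rows: one row per (reg_no, subject)
long_df = pd.melt(marks, id_vars=['reg_no'], var_name='subject', value_name='marks')
print(long_df)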


