Shantanu's Blog

Database Consultant

September 29, 2008


UNIX case study - 5

Read, change and consolidate files

MERGERECORDFILE=/root/mergerecord.$(date +%m%d%y)

# sed s/$/$'test'/g input.txt

Read the list of files saved in the "branch_records.lst" file. Append the filename to the end of each line and save the entire output to the merge record file. The following example does the same thing using a for-in loop.

MERGERECORDFILE=/root/mergerecord.$(date +%m%d%y)
RECORDFILELIST=(`ls i*.txt`)


If you want to add the field delimiter then add this...
and change the sed command to...
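Putting the pieces together, here is a runnable sketch of the merge step. This is my reconstruction, not the original script: the paths and the sample file names (i1.txt, i2.txt, rec1...) are made up for the demo, and the merge file is written to the current directory instead of /root.

```shell
#!/bin/sh
# Create two sample branch record files for the demo (made-up names).
printf 'rec1\nrec2\n' > i1.txt
printf 'rec3\n' > i2.txt

MERGERECORDFILE=./mergerecord.$(date +%m%d%y)
RECORDFILELIST=$(ls i*.txt)

# Empty the merge file, then append every line of every source file
# with its own filename tacked on at the end ($ anchors end of line).
> "$MERGERECORDFILE"
for RECORDFILE in $RECORDFILELIST
do
    sed "s/$/ $RECORDFILE/" "$RECORDFILE" >> "$MERGERECORDFILE"
done
cat "$MERGERECORDFILE"
```

Each line of the merge file now ends with the name of the file it came from, so the consolidated records stay traceable.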


Another way to achieve the same result is to redirect standard output to a file through a spare file descriptor.

OUTFILE='/root/testinput.txt'
INFILE='/root/input.txt'
> $OUTFILE
exec 4<&1
exec 1>$OUTFILE
for LINE in `cat $INFILE`
do
    echo "$LINE"
done
exec 1<&4
exec 4>&-


September 25, 2008


UNIX case study - 4

Creating new file for each line

I have a text file called "input", as in the sample below:
111 zzzz
222 ssss
3333 2222
1233 eeee

I'm trying to write a for loop that produces the commands below.

grep "111 zzzz" FILE > 111
grep "222 ssss" FILE > 222
grep "3333 2222" FILE > 3333
grep "1233 eeee" FILE > 1233

In other words, I want the for loop to produce

grep "first_field second_field" FILE > first_field

I tried to start with something like

for x in `more input`


grep $x FILE >
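One possible answer (my sketch, not from the original question): read the two fields of each line with a while-read loop and redirect the matching line to a file named after the first field. The sample input file is recreated inline for the demo.

```shell
#!/bin/sh
# Recreate the sample "input" file from the question.
printf '111 zzzz\n222 ssss\n3333 2222\n1233 eeee\n' > input

# For every line, grep for "first_field second_field" and save the
# result to a file named after the first field.
while read -r first second
do
    grep "$first $second" input > "$first"
done < input
```

This creates the files 111, 222, 3333 and 1233, each holding its matching line, which is exactly the output of the four grep commands listed above.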


September 17, 2008


Copy table with or without data

There are a number of ways to create or copy tables. The right choice depends on the table type (InnoDB/MyISAM) and on which indexes and records you want to copy.

// copy paste the output to the new location
SHOW CREATE TABLE db2.oldtable

// create table without data exactly like old table (with indexes)
CREATE TABLE db1.newtable LIKE db2.oldtable

// create only table like old table (without indexes)
CREATE TABLE db1.newtable SELECT * FROM db2.oldtable WHERE 1=2

// create table without data like old table at command prompt (-d for without data)
[root@localhost test]# mysqldump db2 oldtable -d | mysql db1

// create table without indexes but with 100 records from old table
CREATE TABLE db1.newtable SELECT * FROM db2.oldtable LIMIT 100

// create table without indexes with all the records from old table
CREATE TABLE db1.newtable SELECT * FROM db2.oldtable

// Exactly identical tables (including indexes) in 2 statements
CREATE TABLE db1.newtable LIKE db2.oldtable;
INSERT INTO db1.newtable SELECT * FROM db2.oldtable;

// Exactly identical tables (including indexes) in one statement
CREATE TABLE db1.newtable (PRIMARY KEY (nation), KEY fsearchIp (searchip)) ENGINE = MEMORY SELECT * FROM db2.oldtable WHERE 1=2

// create table like old table on same or remote server
[root@localhost test]# mysqldump db2 oldtable | mysql -h -uroot -pPassWd db1

// Copy table testme from test database to remote database 'db' (MyISAM only)
[root@localhost test]# scp -p /var/lib/mysql/test/testme.* root@

// Copy table testme to testme1 in the same db (MyISAM only - when no one is using the table)
[root@localhost test]# cp -p testme.MYD testme1.MYD
[root@localhost test]# cp -p testme.MYI testme1.MYI
[root@localhost test]# cp -p testme.frm testme1.frm

// Copy table testme from test database to db database (MyISAM only)
[root@localhost test]# cp -p /var/lib/mysql/test/testme.* /var/lib/mysql/db/

## copy data directory after skipping a few tables
time rsync -av --filter="- /*/data_intermediate*.*" --filter="- /*/oa_session.*" --filter="- /*/ox_audit.*" /mysql/data/ /redis/data/mysql/

// Update the remote table with updates recorded in the bin logs
[root@localhost test]# mysqlbinlog --start-datetime="`date +%Y-%m-%d' '%H:00:00 -d\"1 hour ago\"`" --stop-datetime="`date +%Y-%m-%d' '%H:00:00`" mysql-bin.* | awk '/tbl_company/,/;/' | replace tbl_company mydb.tbl_company | mysql -h -uroot -pPasswd mydb

// Copy the dbDRCGW1 database to a new database shantanu. Drop and recreate the shantanu database first
mysqladmin -uroot -pPassWord drop shantanu; mysqladmin -uroot -pPassWord create shantanu

time mysqldump -uroot -pPassWord -R --single-transaction dbDRCGW1 | perl /home/shantanu/ --delete | mysql -uroot -pPassWord shantanu
# The script is used to remove the "definer" from stored procedures and triggers; it can be found here...


Copy data from one server to another.

Sending server:
mysqldump db_name tbl_name | bzip2 -c | nc -l 1234

Receiving server:
nc 1234 | bzip2 -cd | mysql -uroot -proot@123 db_name
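The pipeline shape can be sanity-checked locally, without MySQL or a second host. This sketch (file names made up) only demonstrates that the bzip2 compress/decompress pair round-trips the stream losslessly, which is what the two nc ends do across the network:

```shell
#!/bin/sh
# Stand-in for the mysqldump stream.
printf 'INSERT INTO t VALUES (1);\n' > dump.sql

# Sender side: compress the stream. Receiver side: decompress it.
bzip2 -c dump.sql > dump.sql.bz2
bzip2 -cd dump.sql.bz2 > restored.sql

# The round trip must be lossless.
cmp dump.sql restored.sql && echo "round trip ok"
```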


September 15, 2008


UNIX case study - 3

Running commands on remote host

# script: keyit.dsa

cat $HOME/.ssh/ | ssh $REM_HOST "cat >> ~/.ssh/authorized_keys"

# script: keyit.rsa

cat $HOME/.ssh/ | ssh $REM_HOST "cat >> ~/.ssh/authorized_keys"

If you want to set up the encryption keys for a new user, first su to that user ID and then issue one of the following commands.
To set up DSA keys issue this command:
ssh-keygen -t dsa

To set up RSA keys issue this one:
ssh-keygen -t rsa

Read the ssh-keygen man page for more details: man ssh-keygen
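For a fully non-interactive run, the key-generation step can be scripted as below. This is a demo sketch: the file name demo_key is made up, and -N "" sets an empty passphrase so that ssh-keygen needs no prompts.

```shell
#!/bin/sh
# Generate a throwaway RSA key pair without any prompts (demo only).
rm -f demo_key demo_key.pub
ssh-keygen -t rsa -N "" -f demo_key -q

# The public half (demo_key.pub) is what the keyit scripts above append
# to the remote ~/.ssh/authorized_keys.
cat demo_key.pub
```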

// copy the rsa credentials to another server
ssh-copy-id -i  ~/.ssh/id_rsa root@

The who, w, last and ps auxw commands give you a detailed picture of the current server state.

There is another way to make your logins passwordless temporarily. You need to add the following lines to the config file.

$ vi ~/.ssh/config
Host *
ControlMaster auto
ControlPath ~/.ssh/master-root@

Change the server IP in the statement above and log in. The next time, you will not be asked for any password and you can connect quickly.
This passwordless session continues as long as the master session stays logged in. Once you disconnect the first session, the one in which you typed the password, this facility ends.

If you do not want a prompt that would ask if you want to save the IP to the known hosts list, add the following as an alias.

alias ssh='ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null'

These options can also be added to /etc/ssh/ssh_config (the client-side configuration file) so that all users can take advantage of this setting.
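For reference, the alias above corresponds to this client-side configuration; a system-wide sketch (StrictHostKeyChecking and UserKnownHostsFile are standard OpenSSH client keywords):

```
Host *
    StrictHostKeyChecking no
    UserKnownHostsFile /dev/null
```

Note that disabling host key checking trades away protection against man-in-the-middle attacks for convenience, so it is best kept to trusted networks.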


September 11, 2008


Unicode support in Knoppix 5.1

The Knoppix Live CD has excellent support for Unicode. This means it works with languages from all over the world, including Indian languages, and not just English. Here are the steps one needs to follow in order to use this CD.
1) Make sure you have Knoppix version 5.1.1 Live CD with you.

2) Insert the CD in the CD drive and start your computer.

3) If the computer starts in Windows, then you have to hit F11 the next time you boot and change the boot sequence so that "CD drive" is the first boot option.

4) Once you have started your computer in Knoppix environment, right click on the hard drive icon found on the desktop and choose the option "make the disk writable"

5) Now right click on the "US" flag on the task bar near the system clock.

6) Choose "India" in the country list and choose "Add". Click on apply, Ok.

7) Start OpenOffice Writer and start typing in Devanagari. You can also save the document in .odf format since we have made the disk writable, as explained in step 4 above.

8) Restart the computer and remove the CD from the CD drive. Let the computer start in Windows. You can now open the .odf file that you saved on the hard drive.
That's it. Enjoy.


September 06, 2008


mysqldump time zone conflict

By default, mysqldump converts TIMESTAMP values to UTC before writing them out. I have the date May 28 in the table, but the dump file has changed it to May 27.
You will need to add the --skip-tz-utc parameter in order to keep the actual values, as shown below...

mysql> select * from testTime;
+--------+---------------------+
| id     | curdate1            |
+--------+---------------------+
| 889683 | 2008-05-28 02:59:56 |
+--------+---------------------+
1 row in set (0.00 sec)

mysql> exit

[root@localhost test]# mysqldump test testTime
/*!40000 ALTER TABLE `testTime` DISABLE KEYS */;
INSERT INTO `testTime` VALUES (889683,'2008-05-27 21:29:56');

[root@localhost test]# mysqldump test testTime --skip-tz-utc
/*!40000 ALTER TABLE `testTime` DISABLE KEYS */;
INSERT INTO `testTime` VALUES (889683,'2008-05-28 02:59:56');
/*!40000 ALTER TABLE `testTime` ENABLE KEYS */;


September 04, 2008


mysqlimport and load data infile

You must have used LOAD DATA INFILE to import data from a text file. mysqlimport does the same thing more conveniently, and from the command prompt.

You can use the mysqlimport utility to import data from any text file. It's really worth a look.

mysql> select * from clients;
+----+-----+----------+
| id | age | name     |
+----+-----+----------+
|  5 |  25 | shantanu |
|  2 |  23 | amar     |
|  3 |  53 | akbar    |
+----+-----+----------+
3 rows in set (0.00 sec)

mysql> select * into outfile 'clients' from clients;
Query OK, 3 rows affected (0.00 sec)

mysql> system mysqlimport test clients;
test.clients: Records: 3 Deleted: 0 Skipped: 0 Warnings: 0
mysql> select * from clients;
+----+-----+----------+
| id | age | name     |
+----+-----+----------+
|  2 |  23 | amar     |
|  5 |  25 | shantanu |
|  5 |  25 | shantanu |
|  2 |  23 | amar     |
|  3 |  53 | akbar    |
|  3 |  53 | akbar    |
+----+-----+----------+
6 rows in set (0.00 sec)

Instead of exiting from mysql and using mysqlimport at the UNIX prompt, I am using the system command, which lets me escape from mysql and execute commands as if I were at the command prompt.

We have created an outfile with the same name as the target table. Now we can import the data from this text file using mysqlimport. mysqlimport works like any other UNIX command and accepts parameters such as -d:

-d, --delete First delete all rows from table.
-f, --force Continue even if we get an sql-error.
-i, --ignore If duplicate unique key was found, keep old row.
-l, --lock-tables Lock all tables for write.
-r, --replace If duplicate unique key was found, replace old row.

And the usual load data infile parameters as well.

--fields-terminated-by=...           Fields in the textfile are terminated by ...
--fields-enclosed-by=...             Fields in the importfile are enclosed by ...
--fields-optionally-enclosed-by=...  Fields in the i.file are opt. enclosed by ...
--fields-escaped-by=...              Fields in the i.file are escaped by ...
--lines-terminated-by=...            Lines in the i.file are terminated by ...


September 03, 2008


UNIX case study - 2

I'm writing a script that provides a menu to manipulate records. I'm having a problem, however. The first option I'm writing is simply to display all of the records in a supplied file.

The supplied file ('records') currently contains sample data, this data is:

95671660:Jones:Sarah:45:sales manager
93272658:Smith:John:43:technical manager
98781987:Williams:Nick:35:computer officer

I need to display this data on standard output. It needs to be arranged in 5 columns, each left-justified. The column order is family name (i.e. Jones), first name, telephone number, department number (i.e. 45) and job title. It needs to be sorted in ascending alphabetical order of family name (i.e. Anderson, Brown etc.)

Basically, it should look (something) like this:


Anderson Sarah 95437869 19 CEO
Brown Sarah 99893878 12 electrician
Couch David 95673456 26 chef
Jones Sarah 95671660 45 sales manager
Smith John 93272658 43 technical manager
Williams Nick 98781987 35 computer officer

I've been trying various solutions using cut and gawk. It's an sh script. Honestly, though, it's getting stupid. Look:


# Print records option.
1) echo -e "Debug: User has selected print option.\n"
#cut -f2 -d: records | sort
awk '$2 $3 $1 $4 $5 {print}' records
#cut -f3 -d: records
#cut -f1 -d: records
#cut -f4 -d: records
#cut -f5 -d: records
echo -en "\nPress Enter to continue... "

It's pretty obvious I haven't a clue how to use (g)awk properly, lol. I'm used to using cut for extracting single columns. I've never had to extract and format a table like this using UNIX.

Can anyone help?
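One possible answer (my sketch, not from the original thread): let awk reorder the colon-separated fields into left-justified columns, then sort the formatted output, which puts it in ascending order of family name since that is now the first column. The sample records file is recreated inline for the demo; the column widths are arbitrary.

```shell
#!/bin/sh
# Recreate the sample data file from the question.
cat > records <<'EOF'
95671660:Jones:Sarah:45:sales manager
93272658:Smith:John:43:technical manager
98781987:Williams:Nick:35:computer officer
EOF

# Reorder the fields (family name, first name, phone, department, title)
# with printf-style left justification, then sort by the leading column.
awk -F: '{printf "%-12s %-8s %-10s %-4s %s\n", $2, $3, $1, $4, $5}' records | sort
```

Piping through sort after formatting works here because the family name is the first thing on each line; with mixed-case or multi-word names a key-based `sort -k1,1` would be safer.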


