Shantanu's Blog

Database Consultant

April 28, 2025

 

Adding a word to a DynamoDB table

When I need to add a word to a DynamoDB table, I use a Lambda function with a function URL. The URL looks like this...

https://z2zsnbwispdo5gh2z544bkblbe0amxfb.lambda-url.us-east-1.on.aws/?धर्माद

And the code is as follows:

import boto3
import urllib.parse

def lambda_handler(event, context):
    request_body = event['rawQueryString']
    print(request_body)

    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('sanskrit')
    key = {'pk': urllib.parse.unquote(request_body)}
    table.put_item(Item=key)

    return {
        'statusCode': 200,
        'body': 'success'
    }

It saves the word "धर्माद" to the DynamoDB table "sanskrit".
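The word arrives percent-encoded in the query string, and `urllib.parse.unquote` reverses that before the write. A minimal local sketch of the round trip:

```python
import urllib.parse

# The browser percent-encodes the Devanagari word in the query string.
raw_query = urllib.parse.quote("धर्माद")
print(raw_query)  # %E0%A4%A7%E0%A4%B0%E0%A5%8D%E0%A4%AE%E0%A4%BE%E0%A4%A6

# The Lambda handler reverses this before writing to DynamoDB.
word = urllib.parse.unquote(raw_query)
print(word)  # धर्माद
```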



June 27, 2024

 

Disable DynamoDB table access

I can deny all access to a DynamoDB table using a resource-based policy. Here is an example:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": {
        "AWS": "*"
      },
      "Action": "dynamodb:*",
      "Resource": "arn:aws:dynamodb:us-east-1:XXX885053566:table/sandhiDupe"
    }
  ]
}

There are many other advantages to managing access at the resource level.
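A sketch of building, and optionally attaching, the same policy with boto3. The account ID and table name are placeholders, and the `put_resource_policy` call requires a recent boto3 release, so it is left commented out:

```python
import json
# import boto3  # uncomment to actually apply the policy

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Deny",
            "Principal": {"AWS": "*"},
            "Action": "dynamodb:*",
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/sandhiDupe",
        }
    ],
}

# The policy document is passed to the API as a JSON string.
policy_json = json.dumps(policy)
print(policy_json)

# dynamodb = boto3.client("dynamodb", region_name="us-east-1")
# dynamodb.put_resource_policy(
#     ResourceArn="arn:aws:dynamodb:us-east-1:123456789012:table/sandhiDupe",
#     Policy=policy_json,
# )
```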



November 01, 2022

 

Check for open ports

This code checks whether any security group rule is open to the world (0.0.0.0/0) and sends an alert to the subscribers of an SNS topic.

import boto3, json

ec2 = boto3.client('ec2', region_name='us-east-1')
sns_client = boto3.client('sns', region_name='us-east-1')

for security_group in ec2.describe_security_groups()['SecurityGroups']:
    # Check every rule, not just the first one, and skip groups without rules
    for permission in security_group['IpPermissions']:
        for ip_range in permission['IpRanges']:
            print(ip_range)
            if '0.0.0.0' in ip_range.get('CidrIp', ''):
                message = {"alert": "open port found"}
                sns_client.publish(
                    TargetArn='arn:aws:sns:us-east-1:102378362623:NotifyMe',
                    Message=json.dumps({'default': json.dumps(message)}),
                    MessageStructure='json')

You may need to change the region name and the SNS topic ARN in the code above.
The code can also be deployed as a Lambda function and scheduled to run every day.
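The scan logic can be pulled into a pure function and exercised locally against a dict shaped like the `describe_security_groups()` response. A sketch, where `find_open_ranges` and the sample data are illustrative names, not AWS APIs:

```python
def find_open_ranges(security_groups):
    """Return (group_id, from_port, cidr) for rules open to the world."""
    findings = []
    for sg in security_groups:
        for perm in sg.get("IpPermissions", []):
            for ip_range in perm.get("IpRanges", []):
                cidr = ip_range.get("CidrIp", "")
                if cidr.startswith("0.0.0.0"):
                    findings.append((sg["GroupId"], perm.get("FromPort"), cidr))
    return findings

# Shaped like ec2.describe_security_groups()["SecurityGroups"]
sample = [
    {
        "GroupId": "sg-12345",
        "IpPermissions": [
            {"FromPort": 22, "ToPort": 22,
             "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
            {"FromPort": 443, "ToPort": 443,
             "IpRanges": [{"CidrIp": "10.0.0.0/8"}]},
        ],
    }
]
print(find_open_ranges(sample))  # [('sg-12345', 22, '0.0.0.0/0')]
```

Keeping the traversal separate from the boto3 and SNS calls makes the alerting logic testable without AWS credentials.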



October 28, 2020

 

Using Lambda Functions as UDFs in Redshift

Let's assume I have a list of client_codes saved in a Redshift table and I need to find the details from an API.

# select client_code from some_table limit 10;
 client_code
-------------
 1001
 2002
 9009
 1009
 1898
 5465
 3244
 5576
 4389
 8756
(10 rows)

I need to get the client addresses from a website. For example, the first client code is 1001 and its address should come from
http://some_site.com/Details?dest=1001

This cannot be done at the SQL level alone. You would normally loop through the results using Python, PHP, Java, etc. But you can also write such scripts as AWS Lambda functions and use them as UDFs (User Defined Functions) in Redshift. For example:

# select client_code, client_details(client_code) as c_address from some_table limit 10;
 client_code |             c_address
-------------+-----------------------------------
 1001        | 21,Tamilnadu,
 2002        | 14,Madhya Pradesh & Chattisgarh,
 9009        | 7,Gujarat,
 1009        | 23,Uttar Pradesh (W) & Uttarakhand
 1898        | 11,Karnataka
 5465        | 3,Bihar & Jharkhand
 3244        | 11,Karnataka
 5576        | 6,Delhi
 4389        | 13,Kolkata
 8756        | 11,Karnataka
(10 rows)

The code of "client_details" Lambda function will look something like this...

import json
import requests  # provided via a Lambda layer, see note 1 below

myurl = 'http://some_site.com/Details?dest='

def lambda_handler(event, context):
    ret = dict()
    res = list()
    # Redshift batches the argument rows into event['arguments']
    for argument in event['arguments']:
        try:
            number = str(argument[0])
            page = requests.get(myurl + number[-10:])
            res.append(page.content.decode('utf-8'))
            ret['success'] = True
        except Exception as e:
            res.append(None)
            ret['success'] = False
            ret['error_msg'] = str(e)
    ret['results'] = res
    return json.dumps(ret)
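Redshift sends one row per inner list in event['arguments'] and expects back a JSON document with "success" and "results" keys. This contract can be exercised locally by injecting a stub in place of the HTTP call. A sketch, where `make_handler` and the stub are illustrative helpers, not part of the Redshift API:

```python
import json

def make_handler(fetch):
    """Build a handler like the one above, with the HTTP call injected
    so the request/response contract can be exercised locally."""
    def handler(event, context):
        ret, res = {}, []
        for argument in event["arguments"]:
            try:
                res.append(fetch(str(argument[0])))
                ret["success"] = True
            except Exception as e:
                res.append(None)
                ret["success"] = False
                ret["error_msg"] = str(e)
        ret["results"] = res
        return json.dumps(ret)
    return handler

# The stub stands in for requests.get; Redshift sends one row per inner list
handler = make_handler(lambda code: "address-for-" + code)
event = {"arguments": [["1001"], ["2002"]]}
print(handler(event, None))
```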

Notes:
1) We are using the "requests" module in this code. Since it is not available in the AWS Lambda environment, I have added it using this layer...
# Layer: arn:aws:lambda:us-east-1:770693421928:layer:Klayers-python38-requests:9

2) You will also need to increase the Lambda timeout, up to 15 minutes if necessary. The API may take more than 3 seconds (the default) to respond.

3) You will also have to update the IAM role associated with your Redshift cluster (Actions - Manage Role). You can add the policy called "AWSLambdaFullAccess" or grant access to a single function as explained in the documentation.

The Lambda function needs to be "linked" to Redshift using a "create function" statement like this...

CREATE OR REPLACE EXTERNAL FUNCTION client_details (number varchar )
RETURNS varchar STABLE
LAMBDA 'client_details'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyUnload';

You need to change the IAM role name and the 12-digit account ID in the statement above to match your own IAM role.

You can now use the Lambda function in your Redshift queries, for example:

# select client_code, client_details(client_code) as c_address from some_table limit 10;

You can read more...

# https://aws.amazon.com/blogs/big-data/accessing-external-components-using-amazon-redshift-lambda-udfs/



April 07, 2020

 

EmailThis service using a serverless API

There are times when I find a great article or web page but don't have time to read it. I use the EmailThis service to save text and images from a website to my email inbox. The concept is very simple: drag and drop a bookmarklet to the bookmark toolbar and click it to send the current web page to your inbox!

https://www.emailthis.me/

But I did not like the premium ads and partial content that the site sends, so I built my own serverless API that provides the same functionality using Mailgun and Amazon Web Services.

https://www.mailgun.com/

Once you register with Mailgun, you will get a URL and an API key; copy them somewhere handy. You will need to provide this information when you launch the CloudFormation template by clicking on this link.


https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/new?stackName=emailThis&templateURL=https://datameetgeobk.s3.amazonaws.com/cftemplates/furl.yaml.txt

Once the resources are created, you will see a URL in the Outputs section, something like this...

https://ie5n05bqo0.execute-api.us-east-1.amazonaws.com/mycall

Now building the JavaScript bookmarklet is easy.

javascript:(function(){location.href='https://ie5n05bqo0.execute-api.us-east-1.amazonaws.com/mycall?email=shantanu.oak@gmail.com&title=emailThis&url='+encodeURIComponent(location.href);})();

Right-click the bookmarks toolbar, add a new bookmark, and paste the above link as its URL. Make sure you change the URL and email address to your own. Now click this bookmarklet while you are on an important web page that you need to send to your inbox. Enjoy!



December 23, 2019

 

Use CloudFormation macros in 3 steps

1) Deploy a Lambda function called "variableSubstitution":

import json

def variable_substitution(event, context):
    context = event['templateParameterValues']
    fragment = walk(event['fragment'], context)
    resp = {
      'requestId': event['requestId'],
      'status': 'success',
      'fragment': fragment
    }
    return resp

def walk(node, context):
    if isinstance(node, dict):
        return { k: walk(v, context) for k, v in node.items() }
    elif isinstance(node, list):
        return [walk(elem, context) for elem in node]
    elif isinstance(node, str):
        return node.format(**context)
    else:
        return node
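Because walk is plain Python, the substitution can be verified locally before deploying the macro. A sketch with the function copied in:

```python
def walk(node, context):
    # Same recursive substitution as the macro above: strings are run
    # through str.format with the template parameter values.
    if isinstance(node, dict):
        return {k: walk(v, context) for k, v in node.items()}
    elif isinstance(node, list):
        return [walk(elem, context) for elem in node]
    elif isinstance(node, str):
        return node.format(**context)
    return node

fragment = {
    "Resources": {
        "MySNSTopic": {
            "Type": "AWS::SNS::Topic",
            "Properties": {"TopicName": "MyTopic-{stage}"},
        }
    }
}
out = walk(fragment, {"stage": "dev"})
print(out["Resources"]["MySNSTopic"]["Properties"]["TopicName"])  # MyTopic-dev
```

Note that str.format will choke on any literal braces elsewhere in the template, so keep placeholders simple.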
       

2) Create a macro called "vS" that refers to the Lambda function:

Resources:
  CompanyDefaultsMacro:
    Type: AWS::CloudFormation::Macro
    Properties:
      Name: vS
      FunctionName: variableSubstitution

3) Use the Macro in Transform:

Transform:
  - vS
Parameters:
  stage:
    Type: String
    Default: dev
    AllowedValues:
      - dev
      - staging
      - prod
Resources:
  MySNSTopic:
    Type: AWS::SNS::Topic
    Properties:
      TopicName: "MyTopic-{stage}" # Python-style string templating



November 24, 2019

 

Sentiment analysis using Twitter data

Here is a nice tutorial, with a CloudFormation template, to enable Twitter sentiment analysis.

https://aws.amazon.com/blogs/machine-learning/exploring-images-on-social-media-using-amazon-rekognition-and-amazon-athena/

and here is another blog post about reading text from images...

https://aws.amazon.com/blogs/machine-learning/building-an-nlp-powered-search-index-with-amazon-textract-and-amazon-comprehend/

If you want to load the data into Elasticsearch instead of Athena as suggested in the article, you will need to add the following code to the Lambda function.

import urllib3
http = urllib3.PoolManager()

host_senti = 'https://search-xxx.us-east-1.es.amazonaws.com/mysentiments/senti/'
host_enti = 'https://search-xxx.us-east-1.es.amazonaws.com/myentities/enti/'
host_rekon = 'https://search-xxx.us-east-1.es.amazonaws.com/myrekognitions/rekon/'

Add an http.request call after each firehose.put_record call (the indentation below matches where each call sits in the original function):

        firehose.put_record(DeliveryStreamName=sentiment_stream, Record= { 'Data' :json.dumps(sentiment_record) + '\n'})
        http.request('POST', host_senti+str(tweet['id']), headers = {'Content-Type': 'application/json'}, body = json.dumps(sentiment_record) )

            firehose.put_record(DeliveryStreamName=entity_stream, Record= { 'Data' : json.dumps(entity_record) + '\n'} )
            http.request('POST', host_enti+str(tweet['id']), headers = {'Content-Type': 'application/json'}, body = json.dumps(entity_record) )

                firehose.put_record(DeliveryStreamName=rekognition_stream, Record= { 'Data' :json.dumps(image_rekognition_record) + '\n'})
                http.request('POST', host_rekon+str(tweet['id']), headers = {'Content-Type': 'application/json'}, body = json.dumps(image_rekognition_record) )



November 22, 2019

 

Elastic indexing using a Python built-in module

Here is another way of inserting a document into an Elasticsearch database. It is the same as using the requests module or curl; the only difference is that it uses the urllib3 module, which is bundled with the AWS Lambda runtime.

import json
import urllib3
http = urllib3.PoolManager()

host = 'https://search-test-xxx.us-east-1.es.amazonaws.com/myyl2/myt/myid'
some_data_structure={"test": "this is one doc"}

def lambda_handler(event, context):
    response = http.request('POST', host,
                            headers={'Content-Type': 'application/json'},
                            body=json.dumps(some_data_structure))
    return response.status
 



September 22, 2019

 

Check EC2 security group for open ports

Here is a Lambda function that will check whether there is an open port in a given security group.
It will send a message to an SNS topic if 0.0.0.0 is found anywhere in that security group.


def lambda_handler(event, context):
    import boto3, json
    ec2 = boto3.client('ec2', region_name='us-east-1')
    sns_client = boto3.client('sns', region_name='us-east-1')
    security_group = ec2.describe_security_groups(GroupIds=['sg-12345'])
    # Iterate over the actual rules instead of probing indexes blindly
    for permission in security_group['SecurityGroups'][0]['IpPermissions']:
        for ip_range in permission['IpRanges']:
            if '0.0.0.0' in ip_range.get('CidrIp', ''):
                print(ip_range)
                message = {"alert": "open port found"}
                sns_client.publish(
                    TargetArn='arn:aws:sns:us-east-1:12345:NotifyMe',
                    Message=json.dumps({'default': json.dumps(message)}),
                    MessageStructure='json')



November 26, 2018

 

Assign IP to spot instance

This Lambda function will assign the given Elastic IP address to the instance launched by a spot fleet request.

def lambda_handler(event, context):
    from boto3 import client as boto3_client
    ec2_client = boto3_client('ec2')
    myip = '118.210.57.140'
    myr = None

    for reservation in ec2_client.describe_instances()['Reservations']:
        for instance in reservation['Instances']:
            if instance['Tags'][0]['Value'] == 'sfr-51319954-4575-4e22-815b-42':
                if instance['State']['Name'] == 'running':
                    myr = instance['InstanceId']
                    print(myr)

    # Guard against the case where no matching instance is running
    if myr is None:
        return "no running instance found"

    ec2_client.associate_address(InstanceId=myr, PublicIp=myip, AllowReassociation=True)

    return "IP assigned"



October 04, 2018

 

Reboot an EC2 instance using boto3

There are times when I need to restart an unresponsive server, and normally I have to connect to the console every time. Here is how this simple task can be automated in 5 easy steps. I can now simply visit a TinyURL to restart the server.

1) Make sure the container restarts when Docker starts:
docker update --restart=always xxx

2) Make sure Docker starts when the server starts:
sudo systemctl enable docker
sudo chkconfig docker on

3) Create a Lambda function to reboot the EC2 instance:
def lambda_handler(event, context):
    import boto3
    client = boto3.client("ec2", region_name="us-east-1")
    # reboot_instances expects a list of instance IDs
    client.reboot_instances(InstanceIds=["i-xxx"])

4) Deploy it using API Gateway:
https://xxx.execute-api.us-east-1.amazonaws.com/Staging

5) Create a TinyURL for the above URL:
tinyurl.com/ipython-xxx



July 12, 2018

 

Monitor S3 bucket policy

All S3 buckets should have a policy attached. Here is a Lambda function that will check and notify if the policy is missing for any bucket. An alert will be published on SNS.

import json
from boto3 import client as boto3_client
lambda_client = boto3_client('lambda')
sns_client = boto3_client('sns')
s3_client = boto3_client('s3')
   
def lambda_handler(event, context):
    bl = list()
    for i in s3_client.list_buckets()['Buckets']:
        mybucket = i['Name']
        try:
            s3_client.get_bucket_policy(Bucket=mybucket)
        except Exception:
            # get_bucket_policy raises an error when no policy is attached
            bl.append(mybucket)

    message = {"bucket_policy": bl}
    response = sns_client.publish(TargetArn='xxxx',
                                  Message=json.dumps({'default': json.dumps(message)}),
                                  MessageStructure='json')

    return ('error', bl)

A daily schedule can be set up using CloudWatch Events. Make sure that the rule is "enabled".



October 08, 2017

 

Send SMS using Amazon

Here is a simple Python script to send an SMS.

vi sendsms.py

import boto3

# Create an SNS client
client = boto3.client(
    "sns",
    aws_access_key_id="XXX",
    aws_secret_access_key="XXX",
    region_name="us-east-1"
)

# Send your SMS message.
client.publish(
    PhoneNumber="+91981XXXXX66",
    Message="Hello World aws again from docker!",
    MessageAttributes={
        'AWS.SNS.SMS.SMSType': {
            'DataType': 'String',
            'StringValue': 'Transactional'
        }
    }
)


And here is a Dockerfile, in case you are not sure whether your server has Python and the boto3 module pre-installed.

vi Dockerfile

FROM python:2.7-alpine
RUN pip install boto3

WORKDIR /root/dev

CMD ["python"]

docker build . -t  shantanuo/myboto

# alias pancard='docker run -i --rm -v "$(pwd)":/root/dev/ shantanuo/myboto python sendsms.py  "$@"'

If you do not have a server where you can host your script and Docker container, no problem. You can use an Amazon Lambda function!



August 28, 2017

 

Invoke Amazon lambda function directly

In most cases I access a Lambda function through API Gateway. But there are times when I need to run the Lambda function directly.

For example, if I have written an Amazon Lambda function to send a mail, I can invoke it from Python as shown below:

import json
from boto3 import client as boto3_client

lambda_client = boto3_client('lambda', region_name='us-east-1', aws_access_key_id='xx', aws_secret_access_key='xx')
x = {"title": "test from lambda client invoke method", "email": "some.name@gmail.com"}
y = lambda_client.invoke(FunctionName="mymail", InvocationType='RequestResponse', Payload=json.dumps(x))

If you want to process the response of the function, save the output to a variable like "y" and then use the read method like this...

y['Payload'].read()
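The payload comes back as a stream of JSON bytes. A sketch of decoding it, with the bytes literal standing in for what y['Payload'].read() would return (the statusCode/body shape here is just an example, not the mymail function's actual response):

```python
import json

# Stand-in for the bytes returned by y['Payload'].read()
raw = b'{"statusCode": 200, "body": "mail sent"}'

# Decode the bytes and parse the JSON into a dict
result = json.loads(raw.decode("utf-8"))
print(result["body"])  # mail sent
```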
_____

You may need to create a new user with programmatic access to the Lambda function. Attach this policy and generate the access and secret keys.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": ["lambda:InvokeFunction"],
      "Effect": "Allow",
      "Resource": "arn:aws:lambda:*:*:*"
    }
  ]
}
You can now share the access keys with others, who can then invoke your Lambda function from another Lambda function or from their own code.



June 29, 2017

 

Amazon User Activity Log

Here is a good article about how to integrate CloudTrail with Elasticsearch.

https://blog.powerupcloud.com/visualizing-aws-cloudtrail-logs-using-elk-stack-e3d3b399af43

Once I follow all these steps, I can log all Amazon account activity into Elasticsearch. This is super useful for knowing what changes are taking place in my account.



February 05, 2017

 

Import csv data file to DynamoDB

Here is a 7-step process to load data from any CSV file into Amazon DynamoDB.

1) Create the pandas dataframe from the source data
2) Clean-up the data, change column types to strings to be on safer side :)
3) Convert dataframe to list of dictionaries (JSON) that can be consumed by any no-sql database
4) Connect to DynamoDB using boto
5) Connect to the DynamoDB table
6) Load the JSON object created in the step 3 using put_item method
7) Test

# Create the pandas dataframe from the source data

import pandas as pd
import boto3

df=pd.read_excel('http://www.tvmmumbai.in/Alumini%20Std.X-2013-2014.xls')

df.columns=["srno", "seat_no", "surname", "name", "father_name", "mother_name", "english", "marathi", "hindi", "sanskrit", "maths", "science","ss","best_of_5", "percent_best_of_5" , "total_out_of_6", "percent_of_600"]

# Clean-up the data, change column types to strings to be on safer side :)

df=df.replace({'-': '0'}, regex=True)
df=df.fillna(0)

for i in df.columns:
    df[i] = df[i].astype(str)

# Convert dataframe to list of dictionaries (JSON) that can be consumed by any no-sql database

myl=df.T.to_dict().values()

# Connect to DynamoDB using boto

MY_ACCESS_KEY_ID = 'XXX'
MY_SECRET_ACCESS_KEY = 'XXX'

resource = boto3.resource('dynamodb', aws_access_key_id=MY_ACCESS_KEY_ID, aws_secret_access_key=MY_SECRET_ACCESS_KEY, region_name='us-east-1')

# Connect to the DynamoDB table

table = resource.Table('marks1')

# Load the JSON object created in the step 3 using put_item method

for student in myl:
    table.put_item(Item=student)

# Test
response = table.get_item(Key={'seat_no': 'A 314216'})
response
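For larger files, writing one item at a time is slow. DynamoDB's BatchWriteItem accepts at most 25 items per request, so the rows have to be chunked. A sketch of the chunking (the `chunks` helper and sample rows are illustrative), followed by a note on boto3's batch_writer, which handles this automatically:

```python
def chunks(items, size=25):
    """Yield successive batches; BatchWriteItem allows at most 25 items."""
    items = list(items)
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Sample rows shaped like the dictionaries produced in step 3
rows = [{"seat_no": str(n)} for n in range(60)]
batches = list(chunks(rows))
print([len(b) for b in batches])  # [25, 25, 10]

# With boto3 you rarely need to do this by hand; batch_writer
# batches and retries for you:
# with table.batch_writer() as batch:
#     for student in myl:
#         batch.put_item(Item=student)
```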



February 03, 2017

 

Create an API in a few minutes

1) Create a table in DynamoDB with the name "ssc" and use the field "seat_no" as a string primary key.
2) Add records for the subjects by creating fields like "English" and "Maths" and adding the marks as strings, e.g. "54", "32". The primary key can be something like "B54MH".

3) Create a Python function in Lambda as shown below:

import boto3
import json

client = boto3.resource('dynamodb')
table = client.Table('ssc')

def lambda_handler(event, context):
    item = {'seat_no': event['seatno']}
    r = table.get_item(Key=item)
    # return only the record, not the full response with its metadata
    return json.dumps(r.get('Item', {}))

4) Create an API
Link the above Lambda function (e.g. "ssc") to the API Gateway.
It is important to correctly specify the mapping for the API GET method execution request:

{"seatno": "$input.params('seatno')"}
_____

Once deployed, the URL will look something like this...

https://9xd7zbdqjg.execute-api.us-east-1.amazonaws.com/S1?seatno=B54MH

Note that we are connecting over secure HTTPS. This makes it possible to consume the API results directly in an application like Slack.

It is also possible to include a custom authorizer using the steps outlined here...

http://docs.aws.amazon.com/apigateway/latest/developerguide/use-custom-authorizer.html



January 29, 2017

 

Find area and operator of a given number

The user will supply a telephone number like 9919838466, and we need to return the operator and area, for example "airtel-delhi".

I have a Python dictionary with these key-value pairs.

mydic={'98198': 'vodafone-mumbai', '98199': 'airtel-delhi'}

Here is a two-line function that extracts the five-digit prefix and looks it up in the dictionary.

def findarea(mnumber, mydic):
    code=str(mnumber)[-10:][:5]
    return mydic[code]

I will test that the function is working as expected...

findarea('9819938466', mydic)
'airtel-delhi'

In order to use Amazon Lambda, I need to modify the function slightly. All user inputs are passed in the event dictionary, so event['text'] captures the text variable from the URL.

def lambda_handler(event, context):
    mnumber=event['text']
    mydic={'98198': 'vodafone-mumbai', '98199': 'airtel-delhi'}
    code=str(mnumber)[-10:][:5]
    final=mydic[code]
 
    yourdic = {
    "response_type": "in_channel",
    "text": final
}
    return yourdic
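Since the handler is pure Python, it can be tested locally before wiring up the gateway. A sketch with the handler copied in:

```python
def lambda_handler(event, context):
    # Same handler as above, reproduced so it can be called locally
    mnumber = event['text']
    mydic = {'98198': 'vodafone-mumbai', '98199': 'airtel-delhi'}
    code = str(mnumber)[-10:][:5]
    return {"response_type": "in_channel", "text": mydic[code]}

# Simulate the event API Gateway builds from ?text=919819938466
print(lambda_handler({"text": "919819938466"}, None))
# {'response_type': 'in_channel', 'text': 'airtel-delhi'}
```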

Name the Lambda function something like "slacktoapi".
_____

Now, using API Gateway, create a new API called "areacode_api" that links to the function created above.

a) Method Execution - GET Method Request - URL Query String Parameters - add a variable called "text"
b) Method Execution - GET Integration Request - Mapping Templates - Content-Type - application/json - Mapping template

{"text": "$input.params('text')"}

Once you deploy this API, you get a URL and can start using it like this...

https://91ovy8ax61.execute-api.us-east-1.amazonaws.com/s1?text=919819838463
_____

Here are 5 steps to add any Amazon API to your Slack channel.

A) Click on the "Build" button on the App Directory page of Slack.
B) Choose the "Make a Custom Integration" button.
C) Select Slash Commands.
D) Create a new command called /dest2 and add the URL mentioned above.
E) Choose the GET method and save the command.



April 28, 2016

 

pass all data to API gateway

It is possible to pass the entire request body through with a mapping template like this:

#set($inputRoot = $input.path('$'))
{
    "body" : $input.json('$')
}

If you want to pass in a subset of the request body, change the '$' selector to the desired JSONPath.

http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-mapping-template-reference.html



April 22, 2016

 

Pinterest to RSS feed discovery

I have written an API to generate an RSS feed for any Pinterest user and all the boards owned by that user. This has been done using Amazon API Gateway and AWS Lambda. Here is how it works...

The Pinterest feed discovery works after supplying the pin URL, something like this...

https://f84jmref3f.execute-api.us-east-1.amazonaws.com/stag/pintorss?url=http://in.pinterest.com/shantanuo/buy/

["http://in.pinterest.com/shantanuo/buy.rss", ["http://in.pinterest.com/shantanuo/feed.rss", "http://in.pinterest.com/shantanuo/my-wishlist.rss", "http://in.pinterest.com/shantanuo/truth.rss", "http://in.pinterest.com/shantanuo/tips.rss"]]

The first feed URL returns the exact feed of the "buy" board. The other boards owned by the user "shantanuo" are also returned, as a list.
_____

And here is the source code:

1) API Gateway:
a) Method Execution - GET Method Request - URL Query String Parameters - add a variable called url
b) Method Execution - GET Integration Request - Mapping Templates - Content-Type - application/json - Mapping template

{"url": "$input.params('url')"}

The above links the variables received by the API Gateway to the Lambda function's event dictionary.

2) Lambda function:

def lambda_handler(event, context):
    myurl=event['url']
    x=feedspot(myurl)
    return x.url_others()

The feedspot class can be copied from ...

https://gist.github.com/shantanuo/e6112e464276e4ccbc34c36620b811f8


