Friday, October 25, 2013

How to Create Custom AMI for Amazon EC2

In this article we will explain how to create and register your own custom AMI from a dedicated or virtual server outside the Amazon EC2 cloud.
An AMI (Amazon Machine Image) is a preconfigured image of an operating system used to launch instances in Amazon EC2.
Suppose we have a dedicated or virtual server and want to try a new project in the Amazon EC2 cloud. Suppose also that we want to use a replica of our server to test this new application, since it already has all the software we need installed.
Well, you must first install the API and AMI tools. They are not in the CentOS repositories, but you can download them from these links:
In Ubuntu you can install directly with the package manager:
:$ aptitude install ec2-ami-tools ec2-api-tools

Building the AMI for Amazon EC2.

First of all, we must get our Amazon account X509 certificate, composed of a private key and a certificate.
We can create this certificate from our own profile under “Security Credentials” in the menu on the left, which lets us download the files for the corresponding private key and certificate, both in PEM format.
Keep in mind that an AMI is tied to the certificate used to build it. If we lose the certificate and generate a new one, our AWS account will be associated with the new certificate, and the AMI will no longer belong to our AWS account, even though it is still hosted in our own S3 bucket. So be careful!
To create the custom image from our server to launch Amazon EC2 instances, run the following:
:$ ec2-bundle-vol -k pk-****************.pem -c cert-*******.pem -u ****-*****-**** -r x86_64 --no-inherit -d /mnt/ami/
-k is the path to the file containing the private key.
-c is the path to the file containing the certificate.
-r is the architecture type, 64-bit in our case.
--no-inherit tells the tool not to inherit instance metadata from Amazon EC2 (required here, since we are bundling outside EC2).
-u is the AWS account number, NOT the Access Key ID.
-d is the destination directory for the bundle.
The above command will generate, in the directory /mnt/ami, the image of our server split into parts:
root@localhost:~# ls -lsa /mnt/ami/
total 2165680
4 drwxr-xr-x 2 root root 4096 jun 13 15:57 .
4 drwxr-xr-x 4 root root 4096 jun 13 15:51 ..
1593432 -rw-r--r-- 1 root root 8589934592 jun 13 15:53 image
12 -rw-r--r-- 1 root root 9370 jun 13 15:57 image.manifest.xml
10240 -rw-r--r-- 1 root root 10485760 jun 13 15:56 image.part.00
10240 -rw-r--r-- 1 root root 10485760 jun 13 15:56 image.part.01
10240 -rw-r--r-- 1 root root 10485760 jun 13 15:56 image.part.02
10240 -rw-r--r-- 1 root root 10485760 jun 13 15:56 image.part.03

Upload AMI to Amazon S3.

With this command we upload the image to a bucket in our Amazon S3 account:
:$ ec2-upload-bundle -b bucketname -m /mnt/ami/image.manifest.xml -a ************* -s *****************
-b is the name of the bucket. If it does not exist, it is created.
-m is the path to the XML manifest file generated when we built the image.
-a is the Access Key ID of our AWS account.
-s is the Secret Access Key of our AWS account.
Both the Access Key ID and the Secret Access Key can be found in our account profile under “Security Credentials” in the menu on the left.
When the image upload is complete, we can see in our Amazon S3 account the image, split into parts, along with the XML manifest file.

Register the custom AMI.

Once our custom AMI is uploaded, we can register it in order to launch instances based on it.
To register the image, just access the EC2 service from our account and click the “IMAGES – AMIs” section. In the top center of the screen is the option “Register New AMI”. If you click it, a form will be displayed in which we can indicate the URL of our AMI.
In our case, it is bucketname/image.manifest.xml.
Once added we can launch instances with this image.
You can also register AMIs from the console with this command:
:$ ec2-register ftbamijapon/image.manifest.xml -a x86_64 -K pk-*************.pem -C cert-***********.pem --region ap-northeast-1
-a is the architecture, i386 or x86_64 (x86_64 in our case).
-K is the path to the file containing the private key.
-C is the path to the file containing the certificate.
--region is the Amazon region in which we want to register our custom AMI on Amazon EC2.
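As a side note, the same registration can be scripted with the boto Python library used elsewhere on this blog. A minimal sketch, assuming the bucket and region from the command above, credentials in boto's configuration, and a purely illustrative AMI name:

import boto.ec2
#Connect to the region where the bundle was uploaded (ap-northeast-1 above)
ec2_conn = boto.ec2.connect_to_region('ap-northeast-1')
#Register the S3-backed image by its manifest path; returns the new AMI id
print ec2_conn.register_image(name='my-custom-ami',
                              image_location='ftbamijapon/image.manifest.xml',
                              architecture='x86_64')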

Things to consider

ZONE OR AMAZON REGION

It is very important to choose the same region for Amazon S3 as for Amazon EC2. If they are in different regions, the traffic generated when loading our AMI at instance launch time will cost more than if both were in the same region.
The same is true for two Amazon EC2 instances that send data to each other, for example a database server and a web server.

KERNEL CHOICE

Another thing to consider is the choice of kernel at AMI registration time. If the appropriate kernel is not selected, our Amazon EC2 instance will not boot.

Monday, August 12, 2013

Online Classes for Learning AWS Administration

Heard about Cloud Computing? Or Cloud Computing with Amazon Web Services?
You're at the right place to introduce yourself to AWS. Not with words, but with actions.

I am an open source enthusiast. I have been working with Linux and UNIX for over 8 years, and I currently work as an AWS architect and solution provider.
I have come across many people, from individual systems administrators to small and medium entrepreneurs. What I have seen is that people still don't know what AWS Cloud Computing is and how it works. In simple words, most individuals lack one thing - "How AWS Cloud Computing works in real time!!!"
I have made an attempt to put everything in front of you with LIVE online practical sessions, so that you don't only learn about AWS but also see how the various services can be configured and integrated with each other. You can get a lot of text literature online, but you will not get these videos under one roof.
Please visit here for class details.

Here is the list of topics I will be covering in this class:
  • Creating Virtual Servers in Cloud
  • Configuring Storage in Cloud
  • Deploy Content Delivery Network (CDN) in Cloud
  • Deploy Load Balancers in Cloud
  • Automatically launch new Virtual Servers in Cloud using AutoScaling
  • Monitor your services and get alert notifications on email and SMS.
  • Configure Highly Available DNS system in the Cloud.
I will be conducting LIVE sessions every weekend where you can ask questions directly and clear your doubts, and if time permits, I will walk through one specific topic from the course as well.

So let's get started, guys. If you take this course seriously, you can enhance your career growth, since Cloud Computing is the next big thing in the IT industry.

This course has been designed especially for beginners. So have your share and start learning Cloud Computing with AWS.. RIGHT NOW.

Please enroll yourself here to know more.

Wednesday, August 7, 2013

Online Video Live Training Session

It's been a while now since I started training people on various technical skills. Although online education has been widely accepted, people still carry a fear about whether the class material will be good, whether the instructor will have sufficient knowledge, and so on.
To overcome this fear and make online education a better place to learn, I have started conducting 1:1 video sessions, where the student can see the instructor and vice versa, and ask any questions; this eventually leads to great discussions, and learning becomes fun.
Please join the online education community and spread the knowledge.

Please visit here to know more.

Manage AWS S3 buckets with Python - Part 2

Here is some more detail on working with S3 using Python.

1. Create a bucket in a non-default location.

import boto
from boto.s3.connection import Location
#Display all available regions
print '\n'.join(i for i in dir(Location) if i[0].isupper())
#Make connection to S3
s3_conn=boto.connect_s3()
#Create bucket in the South America (Sao Paulo) region
s3_conn.create_bucket('nix-bucket091', location=Location.SAEast)
2. Upload a file to S3
from boto.s3.key import Key
#Create bucket object
bucket_name=s3_conn.get_bucket('nix-bucket091')
#Create Key object
s3_key=Key(bucket_name)
#Set key name attribute
s3_key.key='myfile'
#Open the file you want to upload (binary mode is safest)
fp=open('d:/take100.txt','rb')
#Transfer the file
s3_key.set_contents_from_file(fp)
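As a shortcut, boto's Key class also offers set_contents_from_filename, which opens and closes the file for you:
#One-line alternative to the open()/set_contents_from_file pair above
s3_key.set_contents_from_filename('d:/take100.txt')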
3. List all files in a bucket
#Create bucket object
mybucket=s3_conn.get_bucket('nix-bucket1981')
#Loop through all the files in the bucket
for bList in mybucket.list():
     print bList.name.encode('utf-8')

To learn Cloud Computing with AWS, please get yourself enrolled here. This course is only $10 with lifetime access to the material.

Sunday, August 4, 2013

Manage AWS S3 buckets with Python - Part 1

There are many ways to manage your AWS account. For ease of administration, AWS provides a very decent web management console to administer most of the services. However, if you are a systems administrator then you must be looking for something which automates most of your admin tasks and makes your life easy.
There are various APIs released by AWS for most of the widely used programming languages. This article will focus on Python integration with AWS.
In this series of articles, I will walk you through most of the services that you can manage using Python.

Before we start, here is a list of pre-requisites:
1. You must have an AWS account for practice. Don't worry, you can use the AWS Free Tier. Just don't exceed the limits. For more details, please visit the official AWS website.
2. Basic understanding of Python. There are many free tutorials on YouTube, the official Python website, etc.
3. Basic understanding of how AWS services work. Here is an online video tutorial covering most of the AWS services. It will give you a good understanding of how stuff works in AWS. You can get it for ONLY $10.

Let's get to work now.

There is a standard Python SDK for AWS, called boto. You need to install it on your system before you start working with any of the AWS services. Here is the command to install boto:
pip install boto
Once the boto installation is complete, follow these steps to start working with AWS S3.

1. Create the /etc/boto.cfg file and add your AWS Access Key ID and Secret Access Key (a sample file is shown after these steps).
2. If you don't want to keep credentials in a file, you will have to define them explicitly in the Python code (not recommended).
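boto reads INI-style configuration, so the file from step 1 looks like this (placeholder values, obviously):
[Credentials]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY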

3. Connect with S3 and display existing buckets:
import boto
s3_conn=boto.connect_s3()
s3_conn.get_all_buckets()
The above code connects to your S3 account and returns a list of all the existing buckets. The default region is us-east-1.

4. Now, let's create a new bucket. I will first check whether the bucket already exists.
bucket_name='nix-bucket81'
bucket=s3_conn.lookup(bucket_name)
if bucket:
      print "Bucket %s exists" % bucket_name
else:
     s3_conn.create_bucket(bucket_name)
 5. Finally, delete the bucket.
bucket_name='nix-bucket81'
bucket=s3_conn.lookup(bucket_name)
if bucket:
     print "Deleting bucket %s" % bucket_name
     s3_conn.delete_bucket(bucket_name)
else:
     print "Bucket does not exist"

Thursday, August 1, 2013

AWS Launched Edge Locations in India

Good News for Indian users.

AWS has finally launched its edge locations in India. Right now, Chennai and Mumbai have been selected to serve Indian customers. This is really exciting news for AWS lovers in India.

As per the announcement made by AWS, the cloud giant has introduced the Route53 and CloudFront services as of now. This helps Indian customers build highly available DNS systems and experience low latency. CloudFront will help in configuring a CDN for your website or other services you have configured on AWS. CloudFront really boosts performance when your services are accessed from within India.

Another advantage for Indians is cost saving. Until now, Indian users were routed to Singapore edge nodes, which are costlier compared to the edge nodes launched in India.

We really hope that AWS will introduce other services in India too. Since India is a big market with millions of users accessing AWS services, it will be a win-win situation for both AWS and Indian users.

Saturday, July 27, 2013

Cloud Computing with AWS using LIVE sessions

Here come the LIVE classes on AWS. These classes consist of 1:1 interaction, which means you can clear your doubts instantly, engage with the tutor on a specific topic, and get the most knowledge from an industry-experienced tutor.

Please visit here to check the class schedule. For your convenience, all the classes will be taught on the weekend.

So what are you waiting for? MARK YOUR CALENDAR and let's start learning something really really valuable.
 

Let us know your interest for online learning

To deliver the best and most appropriate online course, I need one minute of your time. I would like you to select a topic from those mentioned below. The topic with the highest votes will be published online next month.

Learn Cloud Computing with AWS FREE - Only 10 days left

I'm offering the AWS Cloud Computing course FREE for the next 10 days. If you are looking to learn AWS, please get yourself enrolled here

Friday, July 26, 2013

Cloud Computing with AWS - Only For $10

Heard about Cloud Computing? Or Cloud Computing with Amazon Web Services?
You're at the right place to introduce yourself to AWS. Not with words, but with actions.

First things first - Click here to enroll at a discounted price - http://bit.ly/1boqq6A
I am an open source enthusiast. I have been working with Linux and UNIX for over 8 years, and I currently work as an AWS architect and solution provider.
 
I have come across many people, from individual systems administrators to small and medium entrepreneurs. What I have seen is that people still don't know what AWS Cloud Computing is and how it works. In simple words, most individuals lack one thing - "How AWS Cloud Computing works in real time!!!"

I have made an attempt to put everything in front of you with online practical sessions, so that you don't only learn about AWS but also see how the various services can be configured and integrated with each other. You can get a lot of text literature online, but you will not get these videos under one roof.

You will get lifetime access to these videos. Watch at your own pace and learn every bit of it. Here is the list of topics I have covered in the course:
  • Creating Virtual Servers in Cloud
  • Configuring Storage in Cloud
  • Deploy Content Delivery Network (CDN) in Cloud
  • Deploy Load Balancers in Cloud
  • Automatically launch new Virtual Servers in Cloud using AutoScaling
  • Monitor your services and get alert notifications on email and SMS.
  • Configure Highly Available DNS system in the Cloud.
And there is more. I will be conducting LIVE sessions every weekend where you can ask questions directly and clear your doubts, and if time permits, I will walk through one specific topic from the course as well.

So let's get started, guys. If you take this course seriously, you can enhance your career growth, since Cloud Computing is the next big thing in the IT industry.

This course has been designed especially for beginners. So have your share and start learning Cloud Computing with AWS.. RIGHT NOW.


 

Wednesday, July 24, 2013

Create Virtual Server in cloud using Amazon EC2

Please enroll here for more videos: http://bit.ly/13WcbMB

This course covers basic to intermediate services offered by AWS: configuring virtual servers using EC2, setting up storage using S3 and EBS, archival systems using Glacier, a monitoring solution with CloudWatch, CDN setup using CloudFront, DNS system setup with Route53, configuring a virtual cloud using VPC, sending notifications using SNS, and much more.
I have prepared video tutorials for each of these services, along with LAB sessions for practice. I believe that if you complete this course successfully, you will become a professional in AWS infrastructure management.


Amazon Route 53 Announces CloudWatch Metrics for Health Checks and DNS Failover

AWS has announced the release of Amazon CloudWatch metrics for Route53 health checks. Starting July 9th 2013, you can use CloudWatch to view the status of your Route53 health checks, and you can set CloudWatch Alarms and configure notifications based on health check results.
Now, in addition to using Route53 DNS Failover and health checks to increase the availability of your website, you can also use Route53 health checks for website monitoring.

To learn more about AWS services, please join the online tutorial on AWS.

 

Tuesday, July 23, 2013

AWS announces cheapest Archival system ever

We all know about Amazon S3, the famous storage solution provided by Amazon. S3 can be used for a number of activities: hosting a static website, storing images and videos, or serving as an origin for CloudFront.
In spite of the many features provided by S3, it proves to be costly when anyone needs to keep backups on S3. When we deal with backups, we have to keep a retention period of several months, depending on the nature of the business. Keeping data for a long time adds a huge amount to your bills.

Now comes the question all of us can ask - how do we keep backups on cloud storage at minimal cost?
That is where Amazon Glacier comes into the picture. Amazon Glacier is a low-cost archival system provided by Amazon. You can host 1 GB of data for as low as $0.01 per month. Isn't it amazing?

The best feature Amazon provides is the ability to archive data automatically from S3 buckets. You can configure lifecycle policies for automatic archival. For example, you can configure a rule so that data older than 30 days is archived automatically, as sketched below.
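Staying with the boto library from the earlier Python articles, here is a minimal sketch of such a rule (the bucket name and the 'logs/' prefix are just illustrations):

import boto
from boto.s3.lifecycle import Lifecycle, Transition, Rule
s3_conn = boto.connect_s3()
bucket = s3_conn.get_bucket('nix-bucket091')
#Move objects under logs/ to Glacier once they are 30 days old
to_glacier = Transition(days=30, storage_class='GLACIER')
rule = Rule('archive-logs', 'logs/', 'Enabled', transition=to_glacier)
lifecycle = Lifecycle()
lifecycle.append(rule)
bucket.configure_lifecycle(lifecycle)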

Here comes the big question: how to learn Amazon Glacier, and how to integrate it with other AWS services? The answer is simple. Click here to join the online course on AWS for just $30 and learn everything about AWS. This link provides direct access to enrollment with a discount of more than 50%.

Hurry up! Join the course and start building your infrastructure on Cloud. 

LIVE discussion for AWS Cloud Computing

Guys!!!
Let's start learning AWS in an interactive way. The process is simple.
1. Enroll yourself in the AWS online course for just $30. Here is the direct link to enroll and get the discount.
2. For any doubt, concern, or query, please join the discussion at any time. Click here to join the discussion now.

Isn't it easy??? Above all, you will get a certificate after successful completion of this course. So, be the first CERTIFIED AWS professional, NOW.

Monday, July 22, 2013

AWS Cloud Computing made easy

Don't miss the chance. Avail the Cloud Computing with AWS course for as low as $30. Please click on this URL to enroll now and get the discount - http://tinyurl.com/kbpaejd. The discount coupon is embedded in this URL, or you can use the coupon code AWSCC2 directly.


Saturday, July 20, 2013

Learn Cloud Computing With Amazon Web Services - Introduction

 
 
PLEASE SPREAD THE WORD!!!

Hey Guys!!! The good news is finally here. Get yourself enrolled in the online course "Cloud Computing with Amazon Web Services" and GET CERTIFIED. Yes! It is official.

First 50 Users will get MORE THAN 50% DISCOUNT. Just pay $10 and you will be a CERTIFIED PROFESSIONAL in AWS.
Please visit the link below to enroll yourself and get the discount instantly.
https://www.udemy.com/cloud-computing-with-amazon-web-services-part-1/?couponCode=OZAWS10
Here is a sample of the certificate you will get from udemy.com
https://www.udemy.com/static/images/course-certificate/certificate_sample.png
Don't be LATE.

Get CERTIFIED in Cloud Computing With Amazon Web Services

Hey Guys!!! The good news is finally here. Get yourself enrolled in the online course "Cloud Computing with Amazon Web Services" and GET CERTIFIED. Yes! It is official.

First 50 Users will get MORE THAN 50% DISCOUNT. Just pay $30 and you will be a CERTIFIED PROFESSIONAL in AWS.

Please visit the link below to enroll yourself and get the discount instantly.
https://www.udemy.com/cloud-computing-with-amazon-web-services-part-1/?couponCode=AWSCC2

Here is a sample of the certificate you will get from udemy.com
https://www.udemy.com/static/images/course-certificate/certificate_sample.png

Don't be LATE.


Thursday, July 18, 2013

Learn Cloud Computing Using Amazon Web Services

Hey Guys!!!

Please Enroll: https://www.udemy.com/cloud-computing-with-amazon-web-services-part-1

The first 50 users can avail a MORE THAN 50% DISCOUNT. Please use coupon AWSCC2, OR click on the link below to enroll and get the discount instantly.

https://www.udemy.com/cloud-computing-with-amazon-web-services-part-1/?couponCode=AWSCC2

This course helps in learning the following concepts with 100% practical sessions:
1. Create virtual servers in cloud.
2. Configure storage in cloud.
3. Configure CDN
4. Configure monitoring and notification services.
5. Bulk email solutions in the cloud.
6. Highly scalable DNS system in the cloud.

And much much more!!!!

There will be 5 LIVE Sessions designed for this course.

Wednesday, May 29, 2013

Upload a file to FTPES server using CURL

Security is a major concern when transferring files to another server. Nowadays, most file transfer is conducted over SSL, which adds an extra layer of security by encrypting the data. Two types of such connections are possible.

1. Explicit TLS, which is also called FTPES.
2. Implicit TLS, which is also called FTPS.

In this article, I will demonstrate how to use the very powerful curl utility, which ships with most Linux distributions, to connect to FTPES servers.

Here is an example:

curl -T testfile.txt -k -v --ftp-pasv --disable-epsv --ftp-ssl ftp://domainname.com/directory/

Let's break down the above example to understand how it works.

The first part
curl
is fairly self-explanatory: it invokes the curl command.

We then supply the -T option and the file which we want to upload.
-T testfile.txt

We then add the optional -k switch, which ignores any certificate-related errors.
-k

The -v switch is just used for verbose output, so we can more easily see any errors that may occur.
-v

--ftp-pasv forces the data connection to be made in passive mode (which is also curl's default for FTP). The companion switch --disable-epsv stops curl from first trying the newer EPSV command, which some servers do not handle.

--ftp-ssl (or --ssl in newer curl versions) tells curl to try SSL/TLS for the connection. It reverts to plain FTP if the server does not support SSL/TLS.

Last but not least, provide your server's FQDN (with the ftp:// scheme) followed by the directory name.
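If you would rather script the upload in Python, the standard library's ftplib module provides FTP_TLS, which speaks exactly this explicit-TLS dialect. A minimal sketch, assuming hypothetical credentials and the same server, directory, and file names as above:

from ftplib import FTP_TLS
ftps = FTP_TLS('domainname.com')
ftps.login('username', 'password')   #hypothetical credentials
ftps.prot_p()                        #switch the data channel to protected (encrypted) mode
ftps.cwd('directory')
ftps.storbinary('STOR testfile.txt', open('testfile.txt', 'rb'))
ftps.quit()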

Thursday, May 16, 2013

How to find Linux Kernel and release information


There are several commands to check Linux Kernel version and release information.

1. To check the Kernel Version

  • uname -v
2. To check Kernel Release
  • uname -r
3. To check system architecture
  • getconf LONG_BIT
  • uname -m
4. To know every detail about system
  • uname -a
  • lsb_release -a
  • getconf -a

Please note that if you are aware of the individual switches for each command, it will be easy to use them in scripts.
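For use inside Python scripts, the standard platform module exposes much of the same information; a small sketch mirroring the uname switches above:

import platform
print platform.release()    #kernel release, like uname -r
print platform.version()    #kernel version string, like uname -v
print platform.machine()    #hardware architecture, like uname -m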

Wednesday, May 8, 2013

FTP authentication using PAM on Linux

This article explains how to configure PAM with VSFTP for authentication. It requires a database file that contains all the users and passwords.
To create a db-format file, first create a plain text file, e.g. 'virtual-users', with the usernames and passwords on alternating lines. It should look as shown below:

user1
password1
user2
password2

Once the usernames and passwords are added to the file, it's time to create the database. You may need to install the db_load command if it is not already there. Install it using yum install db4-utils.

Execute the following command to convert the plain file to db format:
# db_load -T -t hash -f virtual-users /etc/vsftpd/virtual-users.db
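Alternatively, Python's bsddb module can write the same Berkeley DB hash format; a sketch under the assumption that your Python's bsddb links against a Berkeley DB version that pam_userdb can read (db_load remains the safer route):

import bsddb
#Create (or overwrite) the hash-format database that pam_userdb will read
db = bsddb.hashopen('/etc/vsftpd/virtual-users.db', 'c')
db['user1'] = 'password1'
db['user2'] = 'password2'
db.close()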

Now, create a PAM file /etc/pam.d/vsftpd-virtual which uses your database. Add the following lines to this file:
auth required pam_userdb.so db=/etc/vsftpd/virtual-users
account required pam_userdb.so db=/etc/vsftpd/virtual-users

Once done, restart the VSFTP service: service vsftpd restart

Now you don't need to create system accounts for FTP use. Just add the new user and password to the file, rebuild the database, and restart the service.

Tuesday, May 7, 2013

Listing files and directories using Python

This is an alternative to the Linux ls command: a Python script to list files and directories on Linux.

#!/usr/bin/python
import os, sys

def getSize(name):
    st = os.lstat(name)          # lstat so a link reports its own size
    return st.st_size

try:
    dirPath = sys.argv[1]
except IndexError:
    dirPath = os.getcwd()

for name in os.listdir(dirPath):
    # listdir returns bare names, so join them with the directory path
    path = os.path.join(dirPath, name)
    if os.path.islink(path):     # test links first: isdir() follows symlinks
        print 'Link     ', getSize(path), os.path.getatime(path), name
    elif os.path.isdir(path):
        print 'Directory', getSize(path), os.path.getatime(path), name
    else:
        print 'File     ', getSize(path), os.path.getatime(path), name

Monday, May 6, 2013

Linux Directory Structure


  • /bin
    • System binaries, including the command shell
  • /boot
    • Boot-up routine
  • /dev
    • Device files for all your peripherals
  • /etc
    • System configuration files
  • /home
    • User directories
  • /lib
    • Shared libraries and modules
  • /lost+found
    • Lost-cluster files, recovered from a disk-check
  • /mnt
    • Mounted file-systems
  • /opt
    • Optional software
  • /proc
    • Kernel-processes pseudo file-system
  • /root
    • Administrator’s home directory
  • /sbin
    • System administration binaries
  • /usr
    • User-oriented software
  • /var
    • Various other files: mail, spooling and logging

Wednesday, May 1, 2013

MySQL DB backup and restore

Backup all databases

    mysqldump -u root -p --all-databases > /var/mysql/backup/all.sql

Backup single database

    mysqldump -u root -p dbname > /var/mysql/backup/db.sql

Backup multiple databases

    mysqldump -u root -p --databases db1 db2 db3 > /var/mysql/backup/dbs.sql

Backup only specific tables in a database

    mysqldump -u root -p dbname tablename > /var/mysql/backup/table.sql

Restore database

    mysql -u root -p dbname < /var/mysql/backup/db.sql
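To run such backups on a schedule, a small Python wrapper around mysqldump can help. A sketch, assuming credentials are supplied via ~/.my.cnf so that no password appears on the command line:

#!/usr/bin/python
import subprocess, time
#Date-stamped dump of all databases (backup path from the examples above)
outfile = '/var/mysql/backup/all-%s.sql' % time.strftime('%Y%m%d')
with open(outfile, 'w') as fp:
    subprocess.check_call(['mysqldump', '--all-databases'], stdout=fp)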

Monday, March 11, 2013

Python - List environment variables

Hey folks!

Here is a simple Python script to list environment variables. It can be used to figure out the value of any environment variable and use it dynamically in your scripts.

#!/usr/bin/python
import os
for key in os.environ:
    print key,'=>',os.getenv(key)

This is an example for Python 2.7. If you are using Python 3.x, you will need to call print as a function, as shown below.
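The Python 3 equivalent (only the print syntax changes):

import os
for key in os.environ:
    print(key, '=>', os.getenv(key))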