Module 1: Provisioning Durable Storage with S3

Amazon S3 is a cloud storage service that provides high durability (99.999999999%), availability, and scalability. S3 stores data in buckets, and every bucket must have a globally unique name. Data is stored as objects, and any amount of data can be stored and retrieved from anywhere on the web through a simple web service interface.

In this lab, I will use Amazon S3 buckets as the primary storage, acting as a bulk repository for the site's content. The buckets will store images, scripts, the site logo, and other related files. After uploading the files and images into the buckets, I will also replace the local storage paths in the project with the corresponding bucket URLs, so that files and images are served directly from Amazon S3 instead of the local machine.

Task list for module 1:

  • To create the appropriate buckets and folders to store the images and scripts
  • To replace the local storage paths with the S3 bucket URLs so that data and images are fetched from the cloud
  • To enable public web hosting for the S3 buckets

The architecture of the project:

[Diagram: diagram1.png — project architecture]

Steps for the task: Creating the appropriate buckets and folders to store the images and scripts

  1. I have created two Amazon S3 buckets in my AWS account, named “dinostoreresourcess” and “dinostoredegradeds”. These two buckets will host the ‘dinostore’ website data such as images, HTML, CSS, and configuration files. [Screenshot: 1.jpg]
  2. The first bucket, “dinostoreresourcess”, is what I call the resources bucket. I created five folders under it: configuration for configuration files, css for stylesheets, js for JavaScript files, productimages for product photos, and siteimages for site background images. [Screenshot: 2.jpg]
  3. The same resource folders exist on my local machine, and at present the website runs from those local folders. [Screenshot: 3.jpg]
  4. Now I will upload the content, such as images and scripts, from my local machine into the corresponding Amazon S3 bucket folders (a scripted alternative is sketched after these steps).

Into the product image folder (productimages), I uploaded the website's product images from my local folder “D:\NET702orginal\NET702.DinoStore\NET702.DinoStore\Content\images”. [Screenshots: 4.jpg, 5.jpg]

Into the site image folder (siteimages) on Amazon S3, I uploaded the site images from the same local folder, “D:\NET702orginal\NET702.DinoStore\NET702.DinoStore\Content\images”. [Screenshot: 6.jpg]
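For anyone who prefers to script this step instead of clicking through the console, here is a minimal sketch using Python and boto3. The bucket names come from this post; the local folder path, credentials setup, and region handling are assumptions you should adapt to your own environment.

```python
# Sketch: create the two buckets and upload the local image folders into S3 prefixes.
# Assumes boto3 is installed and AWS credentials are already configured.
import os
import mimetypes
import boto3

s3 = boto3.client("s3")

# Bucket names must be globally unique. Outside us-east-1 you must also pass
# CreateBucketConfiguration={"LocationConstraint": "<your-region>"}.
for bucket in ("dinostoreresourcess", "dinostoredegradeds"):
    s3.create_bucket(Bucket=bucket)

def upload_folder(local_dir, bucket, prefix):
    """Upload every file in local_dir to s3://bucket/prefix/ with a guessed content type."""
    for name in os.listdir(local_dir):
        path = os.path.join(local_dir, name)
        if not os.path.isfile(path):
            continue
        content_type = mimetypes.guess_type(name)[0] or "application/octet-stream"
        s3.upload_file(path, bucket, prefix + "/" + name,
                       ExtraArgs={"ContentType": content_type})

# The local path is the one used in this post -- replace it with your own folders.
images_dir = r"D:\NET702orginal\NET702.DinoStore\NET702.DinoStore\Content\images"
upload_folder(images_dir, "dinostoreresourcess", "productimages")
upload_folder(images_dir, "dinostoreresourcess", "siteimages")
```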

Steps for the task: Replacing the local storage paths with the S3 bucket URLs so that data and images are fetched from the cloud

My intention is to fetch data from the cloud instead of my local machine. In this task, I will open my project (NET702.DinoStore) in Microsoft Visual Studio and change the locations of the product images and site images.

  1. Now I will change the background image URL in the local bootstrap.css file. First, I copied the link of glyphicons-halflings.png from the siteimages folder of the S3 “dinostoreresourcess” bucket. [Screenshot: 7.jpg]

Then I pasted the copied image URL into the background url() declaration of bootstrap.css (the highlighted location in my screenshot) in the open Visual Studio project. In this way, the reference in the project changes from the local path to the cloud URL. [Screenshot: 8.jpg]

2. Next, I uploaded my local bootstrap.css file to the css folder of the Amazon S3 “dinostoreresourcess” bucket [Screenshots: 9.jpg, 10.jpg], and bootstrap.js and jquery-2.0.2.js to the js folder of the same bucket. [Screenshots: 11.jpg, 12.jpg] (A scripted version of this upload is sketched below.)
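If you would rather upload these three files from a script, a small sketch along the same lines is shown here. It also prints the object URLs to paste into the project; the printed form assumes a bucket in us-east-1 with path-style addressing (buckets in other regions include the region in the hostname), and the local paths are placeholders.

```python
# Sketch: upload the stylesheet and scripts with explicit content types and print their URLs.
# Assumes boto3 and configured AWS credentials; the local paths below are placeholders.
import boto3

s3 = boto3.client("s3")
BUCKET = "dinostoreresourcess"

files = {
    r"C:\path\to\bootstrap.css":   ("css/bootstrap.css", "text/css"),
    r"C:\path\to\bootstrap.js":    ("js/bootstrap.js", "application/javascript"),
    r"C:\path\to\jquery-2.0.2.js": ("js/jquery-2.0.2.js", "application/javascript"),
}

for local_path, (key, content_type) in files.items():
    s3.upload_file(local_path, BUCKET, key, ExtraArgs={"ContentType": content_type})
    # Path-style URL; assumes the bucket lives in us-east-1.
    print("https://s3.amazonaws.com/" + BUCKET + "/" + key)
```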

3. The three script files bootstrap.css, bootstrap.js, and jquery-2.0.2.js are now uploaded. In this step, I will replace the local file references in Index.html (under the solution items in Visual Studio) with the appropriate S3 cloud URLs for those three files.

***When the website runs, the index file will be served from the S3 cloud location instead of the local machine.

Copy the links of those three script files and paste them into the indicated locations in the Index.html file in Microsoft Visual Studio. Now my website's index references have been changed. [Screenshot: 13.jpg]

  4. Next, I changed the welcome note in the Index.html file. This is the welcome message customers see when they log in to the website; you can change it to your preference. [Screenshot: 14.jpg]

5. I have made some changes to the main Index.html file, so now I will save the file to apply them. Note: this Index.html file is the main index file of the website.

Now I will upload the modified index.html file to the Amazon S3 bucket (dinostoredegradeds) so that the index page is served from the cloud. [Screenshots: 15.jpg, 16.jpg]

6. Now I will make my index.html file public. If you don't make the file public, customers can't view the page and its content; the site will simply appear blank. Therefore, before enabling web hosting, you must make your index file publicly accessible (a scripted way to do this is sketched below). [Screenshots: 17.jpg, 18.jpg]
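As a minimal sketch, the same “make public” action can be done with boto3 (assuming the object is already uploaded and the account allows public ACLs):

```python
# Sketch: make the uploaded index.html object publicly readable.
# Assumes boto3, configured credentials, and that public ACLs are permitted on the bucket.
import boto3

s3 = boto3.client("s3")
s3.put_object_acl(Bucket="dinostoredegradeds", Key="index.html", ACL="public-read")
```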

Steps for the task: Enabling public web hosting for S3 Buckets

My website is static: it contains fixed content, the pages are coded in HTML, and customers see exactly the information I have uploaded to Amazon S3 storage. Amazon S3 website hosting only supports static websites.

The other kind of website, a dynamic website, is built from server-side dynamic web pages whose construction is controlled by an application server processing scripts. In server-side scripting, parameters determine how each new web page is assembled, including the setup of any further client-side processing.

  1. Now I made my “dinostoredegradeds” bucket act as a website by enabling website hosting. For this task, go to the bucket properties, select “Static website hosting”, and click “Enable”. [Screenshot: 19.jpg]
  2. Point the index document setting at the appropriate index.html file; you can also set an error document such as error.html. [Screenshot: 20.jpg]
  3. In this step, I made my “dinostoreresourcess” bucket public so that customers of my website can access its folders and files from anywhere across the globe (a scripted version of steps 1–3 is sketched below).

If you don't make the contents of your resources bucket public, your customers will not be able to access those resources. You can try to open one of your product image files by its URL, but it will fail because the file is not public yet.

To make the bucket folders public, click the “dinostoreresourcess” bucket and select all folders. Go to the “More” menu, select “Make public”, and confirm by clicking “Make public”. All contents of the folders in the “dinostoreresourcess” bucket will then be publicly accessible. [Screenshots: 21.jpg, 22.jpg]
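Here is a hedged sketch of the same configuration done with boto3: it enables static website hosting on the “dinostoredegradeds” bucket and attaches a public-read bucket policy to “dinostoreresourcess”. A bucket policy is one common alternative to the per-object “Make public” action; the error document name and the policy statement ID below are illustrative.

```python
# Sketch: enable static website hosting and allow public reads on the resources bucket.
# Assumes boto3 is installed and AWS credentials are configured.
import json
import boto3

s3 = boto3.client("s3")

# Steps 1-2: serve the degraded bucket as a static website with index and error documents.
s3.put_bucket_website(
    Bucket="dinostoredegradeds",
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},  # optional; the name here is illustrative
    },
)

# Step 3: a bucket policy that lets anyone read objects in the resources bucket.
public_read_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadDinoStoreResources",  # illustrative statement id
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::dinostoreresourcess/*",
    }],
}
s3.put_bucket_policy(Bucket="dinostoreresourcess", Policy=json.dumps(public_read_policy))
```

Once hosting is enabled, the console's “Static website hosting” panel shows the website endpoint from which index.html is served.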

4. Having enabled my S3 bucket for web hosting, in this step I will configure my master page file, which contains the site's layout and file references, so that the running website code uses the S3 objects.

In the Site1.Master master page in Microsoft Visual Studio, change the href/src locations of the bootstrap.css, jquery-2.0.2.js, and bootstrap.js links to point at the corresponding S3 file URLs (apply the same copy-and-paste procedure I used for the index.html file). [Screenshot: 23.jpg]

Now I will change the site logo link to the cloud logo image. Copy the link of logo.png from the siteimages folder of the Amazon S3 bucket. [Screenshot: 24.jpg]

Then paste it into the site logo image URL location in the Site1.Master file in Visual Studio. [Screenshot: 26.jpg]

5. So far I have changed the file locations in my source code, but the website's data is accessed from and stored in a database. Therefore, I need to change the product table's image references in the database as well.

I am using MySQL as the database system for this project, and the local database has already been created. I use MySQL Workbench to connect to it, and I have already created two schemas: “dinostoredb” and “dinostoremembershipdb”. [Screenshot: 27.jpg]

All of the tables are stored in “dinostoredb”. [Screenshot: 28.jpg]

In my dinostoredb schema, I will run the query ‘select * from products’ to check the imageReference field of my products, which I have already changed. You can change it yourself by replacing the local path with the S3 object URL, either by copy and paste in Workbench or with an UPDATE query (a sketch is shown below). [Screenshot: 29.jpg]
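As one possible form of that update query, the sketch below uses Python with the pymysql package to rewrite the local path prefix into the S3 URL prefix. The products table and imageReference column names follow this post, while the connection details and the two prefixes are placeholders for your own values.

```python
# Sketch: point products.imageReference at the S3 URLs instead of the local image paths.
# Assumes the pymysql package; connection details and both prefixes are placeholders.
import pymysql

OLD_PREFIX = "~/Content/images/"  # placeholder for the local path currently stored
NEW_PREFIX = "https://s3.amazonaws.com/dinostoreresourcess/productimages/"  # assumes us-east-1

conn = pymysql.connect(host="localhost", user="root",
                       password="your-password", database="dinostoredb")
try:
    with conn.cursor() as cur:
        cur.execute(
            "UPDATE products SET imageReference = REPLACE(imageReference, %s, %s)",
            (OLD_PREFIX, NEW_PREFIX),
        )
    conn.commit()
finally:
    conn.close()
```

The equivalent UPDATE ... REPLACE(...) statement can just as well be run directly in MySQL Workbench.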

Compile and run the source code:

Now I will compile the NET702.DinoStore project and open the “Default.aspx” page to inspect the changes I have made. I will check whether the content is still being loaded from local storage or is now being pulled from Amazon S3 cloud storage.

  • I will go to the “Build” menu of Visual Studio. Build compiles the project's code, and if it encounters any problems during compilation it will report the errors. [Screenshot: 30.jpg]
  • As the code compiles with no errors, I will now run it in a browser. My default browser is Google Chrome, but you can set any browser you like as the default to run the program. [Screenshot: 31.jpg]
  • Finally, we have to inspect the images and the logo on the home page to confirm whether the content is being served from the cloud links or the local links (a quick programmatic check is sketched after the screenshots below).

Click on the logo element to discover where it is fetched from: yes, it is coming from the Amazon S3 cloud location. [Screenshot: 32.jpg]

Click on the dinosaur images: yes, they are also being fetched from the Amazon S3 cloud. [Screenshot: 33.jpg]
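If you want to double-check outside the browser as well, a small standard-library sketch like the one below can confirm that the S3 URLs respond publicly; the two object keys listed are illustrative examples from this post, and the URLs again assume a us-east-1 bucket with path-style addressing.

```python
# Sketch: verify that a few S3 object URLs are publicly reachable (HTTP 200).
# Standard library only; a 403 here usually means the object has not been made public.
from urllib.request import Request, urlopen

urls = [
    "https://s3.amazonaws.com/dinostoreresourcess/siteimages/logo.png",
    "https://s3.amazonaws.com/dinostoreresourcess/css/bootstrap.css",
]

for url in urls:
    try:
        with urlopen(Request(url, method="HEAD")) as resp:
            print(url, "->", resp.status)
    except Exception as exc:
        print(url, "-> FAILED:", exc)
```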

At the end of module 1, we can see that all of the website's images are being pulled from Amazon S3 cloud storage. This was my first step in moving my website from the local machine to the cloud.

I did not run into any errors while migrating the product image storage to the cloud. If you hit any problems during your own migration, please mention them in the comment box and I will try to help with a solution.

In the next module, I will discuss “Using RDS with ASP.NET Applications”.

Thank you 🙂
