
SAWSUG: Makara and SEOmoz

Tonight Amazon hosted the final Seattle Amazon Web Services User Group (SAWSUG) meeting of the year. A strong showing of about 50 participants was kicked off by @ with a State of the Union covering Jeff's new book and AWS's new free tier.

"AWS free tier avail to experiment on getting started on cloud. 750 hrs machine time included. #sawsug #aws" — 26 Oct via Mobile Web


After a short pizza break @ demonstrated how Makara leverages cloud infrastructure to make building and maintaining server stacks easy. While the Makara interface might not be something to write home about, they support an impressive number of cloud providers and stack types. Even more impressive than the two-minute stack setup is the wealth of statistics collected by their system: everything from standard server metrics to application performance comparisons to transaction times. Makara even uses its own platform to spin up 120 EC2 servers when testing new code commits.


The final presentation was @ from SEOmoz on their unique usage of AWS. SEOmoz hosts and processes a significant amount of data with AWS, including a 40-instance EC2 cluster for their API. The cluster serves and processes 8TB of data hosted on S3 in various sizes and forms of compression, using a 500GB key index. In September their API served over 125 million requests, at about 50 requests/second. SEOmoz also has a 10-server cluster in colocation that crawls 300TB of web data every month and another 10-server cluster for serving seomoz.org.
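As a quick sanity check on those numbers, the monthly request total and the quoted per-second rate line up. A minimal sketch (assuming a 30-day September):

```python
# Check SEOmoz's reported September API volume against the quoted ~50 req/s.
seconds_in_september = 30 * 24 * 60 * 60  # 2,592,000 seconds in a 30-day month
requests = 125_000_000                     # "over 125 million requests"

rate = requests / seconds_in_september
print(round(rate, 1))  # roughly 48 requests/second, consistent with "about 50"
```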



Be sure to look for upcoming SAWSUG events, as they are excellent chances to "learn how Amazon Web Services can provide efficient scale of IT infrastructure capacity quickly to meet growing business needs."
