
Conor O'Mahony's Database Diary

Your source of IBM database software news (DB2, Informix, Hadoop, & more)

Archive for the ‘Cloud’ Category

Deploying DB2 and InfoSphere Warehouse on Private Clouds


Cloud computing is certainly a hot topic these days. If an organization is not already using cloud computing, it has plans to do so. The economics, agility, and value offered by cloud computing are simply too persuasive for IT organizations to ignore.

Even the high-profile Amazon outage couldn’t slow cloud computing’s relentless march towards mainstream adoption. If anything, that outage helped make cloud computing more robust by highlighting the need for hardened policies and procedures around provisioning in the cloud.

IBM recently announced updates to a set of products that make it easy to deploy DB2 and InfoSphere Warehouse on private clouds:

  • IBM Workload Deployer (previously known as WebSphere CloudBurst), which is a hardware/software appliance that streamlines the deployment and management of software on private clouds.
  • IBM Transactional Database Pattern, which works with the IBM Workload Deployer to generate DB2 instances that are suitable for transactional workloads.
  • IBM Data Mart Pattern, which generates InfoSphere Warehouse instances for data mart workloads.

These patterns do more than deploy virtual images with pre-configured software. Think of them instead as mini-applications for configuring and deploying cloud-based database instances. Users specify information about the database, and the pattern then builds and deploys the database instance.

The Transactional Database Pattern is for OLTP deployments. It includes virtual machine sizing templates, database backup scheduling, database deployment cloning, and tooling (including Data Studio). The Data Mart Pattern incorporates the features of the OLTP pattern, together with deep compression and data movement tools. It is, of course, configured and optimized for data mart workloads in a virtual environment.
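To make the "mini-application" idea concrete, here is a rough Python sketch of the kind of workflow these patterns automate: collect a few user inputs, pick a sizing template, and produce a deployment plan. All of the names here (the sizing templates, image names, and request fields) are my own illustrations, not the actual pattern interface.

    from dataclasses import dataclass

    # Hypothetical sizing templates, loosely modeled on the idea of the
    # patterns' virtual machine sizing choices (not IBM's actual values).
    SIZING_TEMPLATES = {
        "small":  {"vcpus": 2, "memory_gb": 4,  "storage_gb": 100},
        "medium": {"vcpus": 4, "memory_gb": 8,  "storage_gb": 500},
        "large":  {"vcpus": 8, "memory_gb": 16, "storage_gb": 2000},
    }

    @dataclass
    class PatternRequest:
        """The user-supplied inputs a pattern collects before deploying."""
        db_name: str
        workload: str      # "oltp" or "datamart"
        size: str          # key into SIZING_TEMPLATES
        backup_cron: str   # e.g. "0 2 * * *" for a nightly backup

    def build_deployment_plan(req: PatternRequest) -> dict:
        """Turn the request into a concrete plan a deployer could execute."""
        plan = {
            "image": "db2" if req.workload == "oltp" else "infosphere-warehouse",
            "vm": SIZING_TEMPLATES[req.size],
            "db_name": req.db_name,
            "backup_schedule": req.backup_cron,
        }
        if req.workload == "datamart":
            # The Data Mart Pattern layers deep compression on top of
            # the transactional feature set.
            plan["deep_compression"] = True
        return plan

    print(build_deployment_plan(PatternRequest("SALESDB", "oltp", "medium", "0 2 * * *")))

The point is the division of labor: the user supplies a handful of database-level decisions, and the pattern turns them into a full, repeatable deployment.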


Written by Conor O'Mahony

December 12, 2011 at 5:40 pm

Hadoop Fundamentals Course on BigDataUniversity.com


After spending some time reading about Apache Hadoop, I decided it was time to get my hands dirty. So this weekend, I took the Hadoop Fundamentals 1 self-paced course on BigDataUniversity.com. It is a really nice way to play with Hadoop. You have the choice of downloading the software and installing it on your computer, working with a VMware image, or working in the cloud. I chose the option of working in the cloud. Within a few minutes I had an Amazon AWS account, a RightScale account, and the software installed in the cloud. By the way, although the course is FREE, I did incur some cloud-related usage charges. They amounted to approximately $1 in Amazon charges for the time it took me to complete the course.

The course itself is quite good. It is, as the abstract implies, a high-level overview. It describes the concepts involved in Hadoop environments, describes the Hadoop architecture, and provides an opportunity to follow tutorials for using Pig, Hive, and Jaql. It also provides a tutorial on using Flume. Because of my experience with JavaScript and JSON, I feel most comfortable using Jaql to query data in Hadoop. However, the DBAs among you will probably feel most comfortable with Hive, given its SQL-friendly approach.
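If you are wondering what the MapReduce model underneath all of this looks like, the classic first exercise is word count. With Hadoop Streaming you can write the mapper and reducer as ordinary scripts that read stdin and write stdout, so a minimal sketch fits in one small Python file (the input/output paths and jar location below are placeholders, not from the course):

    #!/usr/bin/env python
    # wordcount.py -- a classic Hadoop Streaming word count.
    # Run with something like (jar path varies by distribution):
    #   hadoop jar hadoop-streaming.jar -input books -output counts \
    #     -mapper "wordcount.py map" -reducer "wordcount.py reduce" \
    #     -file wordcount.py
    import sys

    def mapper():
        # Emit "word<TAB>1" for every token; Hadoop sorts by key
        # before the reduce phase, grouping identical words together.
        for line in sys.stdin:
            for word in line.split():
                print(f"{word}\t1")

    def reducer():
        # Input arrives sorted by word, so we can sum counts in one pass.
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rsplit("\t", 1)
            if word != current and current is not None:
                print(f"{current}\t{total}")
                total = 0
            current = word
            total += int(count)
        if current is not None:
            print(f"{current}\t{total}")

    if __name__ == "__main__":
        mapper() if sys.argv[1] == "map" else reducer()

A nice property of Streaming scripts is that you can test them locally without a cluster: cat book.txt | ./wordcount.py map | sort | ./wordcount.py reduce.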

If you are curious about Hadoop, I’d recommend this course. I’m eagerly anticipating the availability of the follow-on Hadoop course…

Written by Conor O'Mahony

September 6, 2011 at 11:53 am

Free Class: IBM Software on Amazon Web Services (AWS)


Are you interested in deploying IBM software in a scalable, secure, and on-demand cloud environment? If so, then you will be interested in a free 2-day class that IBM is offering in the Toronto area. Attendees will gain hands-on knowledge of the AWS building blocks. You will learn about the services at the feature, function, and philosophy levels, which will help you understand the intersection points between AWS and IBM. You will also learn specific details about using IBM software like DB2, Informix, and WebSphere sMash in Amazon Web Services (AWS) environments. The class covers a wide variety of technical and non-technical topics, such as the following (a short code sketch after the list gives a taste of the hands-on sessions):

  • Cloud computing, virtualization, and AWS tools and technologies
  • IBM products and cloud computing solutions available on AWS
  • How to leverage existing IBM and AWS technologies to achieve Software as a Service
  • Hands-on sessions with AWS technologies and various IBM products on Amazon Machine Images
  • And more…
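To give a flavor of those hands-on sessions: once an IBM product is published as an Amazon Machine Image, launching it is essentially a single API call. Here is a minimal sketch using the boto3 Python library; the AMI ID, region, and instance type are placeholders rather than a real IBM image.

    import boto3

    # Placeholders: substitute a real IBM AMI ID and your own region/size.
    AMI_ID = "ami-00000000"
    REGION = "us-east-1"

    ec2 = boto3.resource("ec2", region_name=REGION)

    # Launch one instance from the image; it accrues usage charges
    # only while it is running.
    (instance,) = ec2.create_instances(
        ImageId=AMI_ID,
        InstanceType="m5.large",
        MinCount=1,
        MaxCount=1,
    )
    instance.wait_until_running()  # block until AWS reports it running
    print("Launched", instance.id)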

The class runs from 25 to 26 January 2010. For more information and to sign up, see Amazon Web Services (AWS) training.

Written by Conor O'Mahony

January 18, 2010 at 2:29 pm

Posted in Cloud, DB2 for LUW

How to Scale Transaction-Intensive Workloads in the Cloud


IBM is actively working on making DB2 easy for its users to deploy in the cloud. You may have read the recent announcements regarding DB2 in the cloud, including the partnerships with Amazon, RightScale, and others. You may also have seen the news that you can transfer DB2 licenses for use in the cloud. Ready-to-deploy Amazon Machine Images, RightScripts, ServerTemplates, and more help you get up and running quickly.

There are already videos on the Web that show you how to deploy DB2 in the cloud. Next week, IBM takes the next step on this journey by hosting a Webcast titled Scalability in the Cloud—Fact or Fiction? In this Webcast, IBM and xkoto will show you, step by step, how to scale transaction-intensive workloads in the cloud. They will also discuss the pitfalls commonly encountered along the way.
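If you are wondering what that scaling looks like in practice: solutions like xkoto's sit between the application and a fleet of database servers, applying writes to every copy (to keep them consistent) while spreading reads across the fleet. Here is a toy Python sketch of that routing idea; the classes and names are illustrative stand-ins, not xkoto's or IBM's actual interfaces.

    import itertools

    class ToyRouter:
        """Toy database router: replicate writes to every server so all
        copies stay in sync; round-robin reads to spread the load."""

        def __init__(self, servers):
            self.servers = servers
            self._next_reader = itertools.cycle(servers)

        def execute(self, sql):
            if sql.lstrip().upper().startswith("SELECT"):
                # Any in-sync copy can answer a read.
                targets = [next(self._next_reader)]
            else:
                # Writes go to every copy (a real product would do this
                # transactionally, and handle failures and replication lag).
                targets = self.servers
            for server in targets:
                print(f"{server}: {sql}")

    router = ToyRouter(["db2-node-1", "db2-node-2", "db2-node-3"])
    router.execute("INSERT INTO orders VALUES (1, 'widget')")
    router.execute("SELECT COUNT(*) FROM orders")
    router.execute("SELECT * FROM orders WHERE id = 1")

Read throughput then scales roughly with the number of servers, which is why this approach suits transaction-intensive workloads that are dominated by reads.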

Written by Conor O'Mahony

July 22, 2009 at 6:20 pm

Posted in Cloud, DB2 for LUW
