Autoscaling in Windows Azure

In more than a third of my customer conversations, the term ‘autoscale’ comes up. Loosely defined, it means matching computing capacity to load levels (in non-fancy talk: having enough servers so you don’t tip over when too many users visit / interact with your site).

Companies are keen to ensure that they can take full advantage of the elastic aspects of the cloud: the ability to deploy more resources when they are needed, and take them down when they are not.

Some examples, you ask? More than a few application scenarios / patterns present themselves. See if your application fits any of these patterns:

    • On / Off application patterns: if you only need a system at definable intervals (payroll, training, accounting systems, and so on) .. the per-minute usage charges of Windows Azure go a long way toward cost savings and usage optimization.
    • Predictable burst activity pattern: maybe your site serves users around identifiable dates (Valentine’s Day, Mother’s Day, the silly season, and so on). The elastic nature of Windows Azure Cloud Services enables you to supply additional capacity to manage your customer load requirements.
    • Unpredictable growth pattern: it is expensive and time-consuming to deploy server capacity before it is actually needed .. something startup companies deal with every day. Pay-as-you-go (PAYG) pricing and elasticity help small companies grow on their own timeline, using current cash flow to support their capacity requirements.

An autoscaling scenario may meet your needs, optimizing usage / capacity while balancing both against the ultimate cost. A few considerations:

  • Are you short on time and implementation skills, requiring an immediate and managed solution?
  • Do you have an experienced (or willing to learn .. my favorite) development / implementation staff?
  • Do you want something easy, inexpensive and unmanaged?

Depending on your scenario and competency:

    • If short on time and / or lacking implementation as a core competency, companies like RightScale offer managed (read: for a price) solutions to monitor and manage your capacity based on rules that you set. You can try RightScale for free, but be mindful of ongoing costs and balance them against operational efficiencies.
    • If you have a bit more time and an able development staff, the Autoscaling Application Block is a component you can add to your code that gives you greater control over when and how your capacity will be increased (see the sketch after this list). Ready to get started? Check out How to use the Autoscaling Application Block on MSDN. Ready for a ‘geekend’? Here’s the Autoscaling Namespace.
    • For easy and unmanaged (well .. managed by you, just not in real time), we’ve recently announced portal-controlled autoscaling options in preview. These features are available in the Windows Azure Portal for your cloud services.
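On that second point: to make the Application Block option a bit more concrete, here’s a minimal sketch of what a reactive scaling rule boils down to. This is Python illustrating the concept only .. the real block is a .NET component driven by XML rule definitions, and the thresholds and instance limits below are made-up assumptions:

```python
# Conceptual sketch of a reactive autoscaling rule .. NOT the actual
# Autoscaling Application Block (a .NET component configured via XML).
# Thresholds and instance limits below are illustrative assumptions.

MIN_INSTANCES = 2   # constraint rule: never scale in below this floor
MAX_INSTANCES = 8   # constraint rule: never scale out above this ceiling

def evaluate_reactive_rule(avg_cpu_percent: float, current_instances: int) -> int:
    """Return the desired instance count for one evaluation cycle."""
    desired = current_instances
    if avg_cpu_percent > 75:        # running hot .. scale out
        desired += 1
    elif avg_cpu_percent < 25:      # sitting idle .. scale in
        desired -= 1
    # Constraint rules always win over reactive rules.
    return max(MIN_INSTANCES, min(MAX_INSTANCES, desired))

print(evaluate_reactive_rule(82, 3))   # burst of load -> 4
print(evaluate_reactive_rule(12, 3))   # idling -> 2 (held at the floor)
```

The real block evaluates rules like this on a schedule against performance counters and queue lengths, then adjusts your role’s instance count for you.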

On the third point above, Scott Guthrie posted an update to the platform from the Microsoft Build conference regarding GA for paid websites and mobile services AND autoscale support .. complete with screenshots (check out the “How to enable autoscale” section).

In short (too late!), options abound; there’s at least one to match the technical requirements of your project, or the technical skills (and time allotted) of your dev team. Here we are: yet another way to improve and optimize our experiences in the Cloud!

I’ll see you there.


Schmaltz ..

.. sort of defines me. Not that I’d confess that .. but let’s discuss:

So .. when I can, I find the schmaltzy story on the video player on a plane. You’ll LOVE how the Web defines it: excessive sentimentality (especially in music or movies) .. or rendered chicken fat.

.. I get that (those).

You’ll find me, on the aisle, watching an unnecessarily sentimental film and feeling deeply about it.

Thanks Comcast!

New modem .. wahoo!


New Digs, New Speed

Just ran a quick speed test from my mobile phone in my new digs .. first over Wi-Fi (Comcast), then over LTE (AT&T).


In a word: wow.

Hiding Everything on Facebook ..

Dear Lord .. how does this happen?

I am sure, through a convoluted series of posts and ‘accepts’, that it does.

Let’s not let THAT happen again.

Yeah, I’m fifty (something), but I talk like I’m thirty (something)

Here’s what I know:

Beyond that, I also know:

  • TTBOMK (always fun).
  • STFU (reminding all that ‘U’ somehow manifests itself as ‘door’ for the FCC-proper-types).
  • WTF (Whisky-Tango-Foxtrot, which, in no way, shape or form can be FCC-proper).

For (my) whimsy, I’d add:

So .. where does that put me? Where does it put you?

When I was 25 ..

.. it was a very good year.

Granted, mine was styled in copper .. diesel, but a very nice ride.

1979-1985 Cadillac Eldorado

It Just Works: Excel 2010 to SSAS

It’s been a while since I posted a geeky article .. so it’s well past time.

I have the pleasure of working with a talented data warehouse architect on my current project, and the need to connect Excel 2010 to SSAS became a reality this past week. The instructions to do so are easy enough:

  • Navigate to the Excel 2010 Data tab.
  • Click ‘From Other Sources’ in the ‘Get External Data’ section of the ribbon.
  • Select ‘From Analysis Services’; you’re presented with a ‘server name’ and ‘credentials’ dialog. We are using a Windows Azure Virtual Machine for this project, so you may have to create an endpoint that maps to an obfuscated port number (write me for details). Fill out these fields, adding “:port number” after the server name, and supply your local login information (ensure it represents a local account with an SSAS role on the system).
  • Click “Next”, and here’s where it gets dodgy:

  • You should be presented with a “Select Database and Table” dialog box. Select the cube you want to use and click “Next”.
  • In the “Save Data Connection File” dialog, click “Authentication Settings”, then select “None”. Trust me on this: you won’t see all the screens you need unless you have the system force you to enter login information in another step.
  • Click “Finish”; you may be prompted to save the .odc file, replacing the old one (I did this many, many times).
  • You will then see the “Import Data” dialog, which lets you place the PivotTable in your current worksheet. Select the location and click “OK”.
  • Now the fun starts:

  • You may get an error dialog: “An error occurred in the transport layer”. Click “OK”. Because you selected “None” in the previous step, you’ll be presented with a new dialog, the “Multidimensional Connection”.
  • In the “Multidimensional Connection” dialog, select “Analysis Server”; you will see the server you identified earlier.
  • Your User ID should come over too .. enter your system password and click “Next”.
  • Select your database in the next screen and click “Finish”.
  • Your .odc file should now be set up properly; save your Excel sheet and re-open it. You may see the “transport layer” error again, but after this, you’ll be prompted to re-enter your system password.

    Ideally, you should be able to cache this login information .. I’m looking into that and will update the post. (In the meantime, here’s a peek at what lands in that .odc file.)
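For the curious: the .odc file is just an HTML / XML wrapper around an OLE DB connection string. Here’s a rough sketch of the string’s shape .. the server name, port and catalog are hypothetical placeholders, and credentials are omitted because we chose “None” for authentication (you get prompted instead):

```python
# Rough shape of the OLE DB connection string embedded in the saved .odc
# file. Server, port and catalog below are hypothetical placeholders ..
# substitute your VM's DNS name, your Azure endpoint's public port, and
# your cube database. No credentials are stored, since authentication
# was set to "None" (Excel prompts for them at connect time).

server = "myssasvm.cloudapp.net"   # hypothetical Azure VM DNS name
port = 57500                       # hypothetical obfuscated endpoint port
catalog = "AdventureWorksDW"       # hypothetical SSAS database

connection_string = ";".join([
    "Provider=MSOLAP",
    f"Data Source={server}:{port}",   # the ':port' you typed in the dialog
    f"Initial Catalog={catalog}",
])
print(connection_string)
# Provider=MSOLAP;Data Source=myssasvm.cloudapp.net:57500;Initial Catalog=AdventureWorksDW
```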

It was an Asteroid after all ..

.. confirmed (or, at least firmly postulated) by a herd of American and European researchers.

This bunch has tested (and re-tested) debris from the Chicxulub impact crater off the Yucatán Peninsula in Mexico. Their findings date the impact to within 11,000 years of the Cretaceous-Paleogene extinction .. almost simultaneous, geologically speaking. From the article:

When dealing with geological timescales, a range of 11,000 years is about as accurate as you can get. As the research paper puts it, though, “the Chicxulub impact likely triggered a state shift of ecosystems already under near-critical stress.”

Not a bad timeline after all. Please see: Finally confirmed: An asteroid wiped out the dinosaurs | ExtremeTech

Now, if we can only solve the chicken-or-the-egg question.

Warming up to SkyDrive

Just a little. Someone mentioned SugarSync to me the other day .. may have to take a peek at it.

With the loss of Live Mesh (see Live Mesh .. We Barely Knew Ye), I’ve had to do a few things a little differently. Besides reliable sync, local storage and pinpoint sharing accuracy, two of the nicest features of Live Mesh were Email Signature and Favorites sync. With Mesh, you could simply click a property box, and those items got synced across any of the systems in the Mesh. From the looks of it, these are not supported as easily in SkyDrive.

I haven’t solved the Email signature sync, but here’s how I resolved the Favorites sync:

Your IE Favorites are stored in a personal directory called Favorites in the \Users folder of your system. The simple way to ensure you can access your favorites is to sync this folder. Under Mesh, the folder was left in place and connected into the Mesh. With SkyDrive, it’s a bit different .. instead of moving to the mountain, we move the mountain to us. In Windows Explorer (not IE):

  • Navigate to C:\Users\yourname\.
  • Right-click the Favorites folder and select ‘Properties’.
  • Click on the ‘Location’ tab.
  • Browse to a folder underneath the root of your SkyDrive folder and click ‘OK’, letting Windows move the existing files when prompted.

In a few moments, SkyDrive will obligingly copy your Favorites folder, making it available to other systems. The process is the same on the other system .. navigate as above and point the Favorites folder to the one beneath the SkyDrive root. You might see some duplicates of system- or OEM-created shortcuts, so replace with care. (To double-check that Windows recorded the new location, see the sketch below.)
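If you want to verify the move took, the Favorites shell-folder path lives in the registry. Here’s a read-only Python sketch .. it assumes a standard Windows setup, and the post-move SkyDrive path in the comment is hypothetical:

```python
# Read the registered location of the Favorites shell folder. This only
# reads the registry .. it changes nothing. Assumes a standard Windows
# setup; the post-move path shown below is hypothetical.
import winreg

KEY_PATH = r"Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    favorites, _ = winreg.QueryValueEx(key, "Favorites")

print(favorites)
# Before the move: %USERPROFILE%\Favorites
# After the move (hypothetical): C:\Users\yourname\SkyDrive\Favorites
```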

I’ll keep looking for a way to sync Email signatures and advise. For the moment, I keep them in a SkyDrive-synced folder in a Word document, which works well enough.
