IBM Cloud Private – Agile Infrastructure

Overview
IBM Cloud Private enables Cloud Services and an Agile Infrastructure behind a company firewall. While Cloud Private can be deployed in the Amazon and IBM Clouds in supported configurations, there is a sweet spot for companies that realize the value of deploying IBM Cloud Private to expose workloads as Services, rather than Servers, within their on-premises environment.

Agility and Utility
Let’s use an example of something near and dear to all of us: our mobile phone. We all have one (or more):

We’ve made this device ours by customizing it to within an inch of its life to suit our needs. You likely have:

  • A workout application
  • Any number of credit card and / or banking applications
  • Shopping applications

Note that your phone is agile .. You can add / remove applications as you need. A brief use case .. You want to attend a concert:

  • You download the application
  • Enter your information
  • Purchase tickets
  • Show the e-ticket image when you arrive at the venue
  • Snap photos and video snippets, uploading like mad during the concert

When all this is done .. You can delete the application to make room for other applications.

Mobile Games? Yes: you might upgrade your mobile game from that jewel thing to that bird thing .. deleting the unwanted game when you’ve won.

This is also quite similar to a familiar Swiss Army-style utility knife .. we’ve all seen this one:

You use:

  • The scissors for thread
  • The blade for preparing food
  • The saw for cutting wood
  • The bottle opener for wine
  • The file to get yourself out of prison

You snap out the tool you need, use it and then snap it back when you’re finished, moving round to the next task at hand.

You need not carry an entire toolbox with you .. that would weigh you down. This device ensures you can finish the work you need at the moment, and then move on to your next bit.

This is IBM Cloud Private: Cloud Services and Agile Infrastructure behind your firewall.

IBM Cloud Private – Innovative and Open
IBM is on to something with Cloud Private. IBM has leveraged assets from the world of Open Standards, building an execution platform that includes policy-based oversight, Role-Based Access Control (RBAC) and a variety of alerting and monitoring functions .. all supplied out of the box (OOB). Add to this the capability to deploy Services rather than Servers .. Services that include the bulk of the IBM Middleware portfolio, offerings from other vendors and applications from the Open Source Community.

This is not a ‘Rip and Replace’ effort .. This is ‘Augment and Enhance’, ‘Consolidate and Optimize’. This is Agile Infrastructure, within your on-premises and cloud environments.

IBM Cloud Private – What Does it Do?
Some high-level pillars. IBM Cloud Private:

  • Provisions Cloud Services behind your firewall .. IBM offers fully-supported IaaS solutions for Cloud Private as well.
  • Enables an Agile Infrastructure, where you run Services, rather than CapEx Servers, many in a PayGo pricing model.
  • Makes hundreds of Services available from the Catalog (think of it as an application store for your infrastructure) .. with that said, any application packaged as a Helm Chart (by a vendor or by your company) can be deployed into a Cloud Private cluster.

Key bits: Services rather than Servers and an optimized execution model.

IBM Cloud Private – Architectural Discussion
IBM Cloud Private deploys atop commodity hardware, running on the Linux Operating System. To deconstruct:

  • IBM Cloud Private exposes nodes to manage operations and enable Services.
  • Applications run as Services atop the Cloud Private nodes, providing access to Open Source applications, built-in monitoring / management capabilities and the IBM Middleware catalog.
  • Cloud Private has a number of Onboard services for Applications, Monitoring, Management, Alerting, Scheduling, and so on, integrating with the applications you use today.

This discussion is hyper-simplified for brevity. Please see IBM Cloud Private 3.1.0 Architecture for a deeper look, or ring me up.

The Customer Environment
By most accounts, your environment looks a bit like this:

  • You’ll see several hardware, virtualization and deployment layers within.
  • While you may have some automation in place, it is likely not standardized across deployment paradigms and capabilities, and it probably doesn’t give you an all-up view of how best to manage the infrastructure assets across your organization.
  • Each of the blue boxes potentially represents:
    • A licensing requirement
    • A management requirement
    • A hardware requirement ..

.. all well before we get to actually servicing the users, devices and systems that are most important to our audiences. By this, I mean:

  • Recognizing CapEx licenses that require renewal
  • Maintaining patch levels across all the operating systems above
  • Infrastructure (hosts for VMs and / or physical servers) .. taking into account hardware refresh requirements

.. and the list goes on.

Some Notes on Workloads
When thinking about your workloads, you’ll likely realize:

  • Most are running 100% of the time (atop single-purpose virtual or physical hardware)
  • You’re paying for a software license 100% of the time
  • These workloads are not running at capacity 100% of the time

You pay for the hardware and the licenses 100% of the time .. given the bits above, let’s think about another way to deploy these workloads. In a perfect world, how should these workloads run?

  • Transient: specific-use workloads that you deploy, run and remove as your needs dictate. This is similar to the mobile device concert analogy, above. Note that these are the de facto standard for proof of concept, testing or introducing new workloads into an environment .. once you deploy Cloud Private, you can evaluate these workloads as Services, rather than deploying Servers.
  • On-Demand: pre-built, pre-configured and deployed on a moment’s notice .. note that these services can be spun up and available in seconds, versus VM / Physical Server start times.
  • Long-Running: Ongoing workloads for management, monitoring and alerting functions. These services are always available, at minimal PayGo cost. When they need to be scaled (dayparts, data volumes, activity), they can be, via defined policy.
  • Scalable: workloads with defined criteria that can expand to available capacity; the scalability typically triggered by capacity demands and under policy. These can be Transient, On-Demand or Long-Running workloads .. Again, under policy.
  • Burst: workloads scheduled during times the system is idle to improve utilization, then scaled back so normal processing continues until the next burst opportunity .. again, all under policy.

Ask Yourself: How many of the workloads you run 100% of the time, consuming CapEx licensing and hardware the entire while, might otherwise fit into one of the above paradigms?
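To make that question concrete, here is a back-of-the-envelope comparison of an always-on licensed server versus the same workload run as a transient PayGo service. All figures are made-up illustrations, not actual IBM or vendor pricing:

```python
# Hypothetical cost comparison: an always-on CapEx workload versus the same
# job run PayGo. Every number below is illustrative, not real pricing.

HOURS_PER_MONTH = 730  # average hours in a month

def always_on_cost(license_per_month, infra_per_hour):
    """CapEx-style cost: license plus infrastructure, billed 100% of the time."""
    return license_per_month + infra_per_hour * HOURS_PER_MONTH

def paygo_cost(rate_per_hour, hours_used):
    """PayGo cost: you pay only for the hours the service actually runs."""
    return rate_per_hour * hours_used

# A batch workload that only needs ~60 hours of compute a month:
capex = always_on_cost(license_per_month=400.0, infra_per_hour=0.50)
paygo = paygo_cost(rate_per_hour=1.25, hours_used=60)

print(f"Always-on: ${capex:.2f}/month, PayGo: ${paygo:.2f}/month")
# → Always-on: $765.00/month, PayGo: $75.00/month
```

Even at a PayGo rate more than double the raw infrastructure rate, the transient deployment wins decisively when the workload is idle most of the month.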

Determine Applicability
You should have a look at IBM Cloud Private if a number of the following conditions exist in your environment:

  • CapEx licenses that require renewal .. Especially where these licenses can be implemented as PayGo Services
  • You’re running a lot of VMs with single or minimal workloads .. Especially where these VMs have CapEx licenses, as above
  • You’re running a lot of VMs that are consuming too much of your infrastructure capacity, requiring more management than you’d like to dedicate
  • Your deployment includes multiple clouds (Public, Private or Hybrid)

Lots of questions above, likely requiring input from others in your organization .. Consider all the folks who have their hands on the keyboards, managing your infrastructure and deployments.

What you need:
An Agile Infrastructure session. In this session, we’ll unpack:

  • Your Use Cases / Execution
  • Your Deployments: Physical / Virtual
  • Your Licensing: CapEx / Subscription
  • Your Workload Management
  • Your Cross-Workload Integration
  • Your SaaS Integration Opportunities

.. and more. I can put you in touch.

Windows Azure Memory-Intensive Instance Options

There’s a FABULOUS, eye-catching headline, to be sure .. I’ll work on it.

It is exciting news though. As soon as Windows Azure announced support for Infrastructure as a Service (IaaS, for short), IT folks came out of the woodwork seeking customized sizing options beyond the original five instance sizes offered by the service.

Now, while the A5 and A7 IaaS instances have been out for a while, the recent update adds an A6 IaaS instance, plus Cloud Service versions of the same capacities. This allows developers to deploy their memory-intensive applications in the Platform as a Service paradigm and saves the IT department from having to manage from the operating system ‘up’, as with IaaS.

Name   CPU Cores   RAM (GB)
A5     2           14
A6     4           28
A7     8           56
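Picking from the table is mechanical: take the smallest instance whose RAM covers your workload. A quick sketch of that selection logic, using the sizes above (the function name and the 20 GB example are mine, not from any SDK):

```python
# Pick the smallest instance that satisfies a memory requirement, using the
# A5/A6/A7 sizes from the table above (RAM in GB).

INSTANCES = [
    ("A5", 2, 14),   # (name, CPU cores, RAM in GB), ordered smallest first
    ("A6", 4, 28),
    ("A7", 8, 56),
]

def smallest_fit(ram_needed_gb):
    """Return the name of the smallest instance with enough RAM, or None."""
    for name, cores, ram in INSTANCES:
        if ram >= ram_needed_gb:
            return name
    return None  # nothing in this family is big enough

print(smallest_fit(20))  # → A6 (14 GB is too small, 28 GB fits)
```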

Large memory instances are also available for both the Linux and Windows operating systems.

Pricing? Always. The Cloud Service and IaaS pages are updated with the full set of instance sizes:

Larger memory instances give developers and IT professionals the means to move more of their applications and IT assets into the cloud than ever before, no longer forced into refactoring or workarounds by high memory requirements. All in all, another step in the great journey to the cloud!

I’ll see you there.

PowerShell: Must have Microsoft Online Services Sign-In Assistant

I posted “Office 365 and PowerShell” a few weeks back and since then have been digging into the various ways PowerShell helps manage Office 365 installations. You’ll find a robust command set that can automate a wide number of deployment and management operations.

Setup was pretty straightforward the last time I did it (a few months back): download and install the Microsoft Online Services Sign-In Assistant for IT Professionals Beta and the appropriate cmdlets (please see my prior article) and you’re ready to go.

Now, I’ve rebuilt my system since then, so it’s time to do it all over again. This time, however, I had mixed results. I installed the RTW version of the Online Services Assistant, rebooted, and then ran the installation program for the cmdlets .. and got this:

“In order to install Windows Azure Active Directory Module for Windows PowerShell, you must have Microsoft Online Services Sign-In Assistant version 7.0 or greater installed on this computer”

Huh? :: mutter ::  Didn’t I just do that?

So, after a lot of uninstalls, reinstalls, reboots and more reinstalls, I ascertained that a version check between the Beta and RTW bits is failing. After some Bing-ing, Google-ing and swear-ing, I came across a social post on MSDN — “Cannot install Azure Active Directory Module for Windows PowerShell. MOSSIA is not installed” — which suggests installing the Beta and then the RTW, with an added registry fix that bumps a version parameter to the later version to satisfy the check.

The fix seems to work, and the author of the post provided two .reg files: one to make the change and the other to undo it. My suggestion (as Microsoft will certainly distribute newer versions) is to perform the Beta and RTW installs, then export the registry settings so you can revert, just in case.

HTH.

Backup / Restore Windows System Disks in Windows Azure

As we all continue to embrace the Cloud (Windows Azure, in particular) as our machine-of-choice for commodity IT operations, the requirement to capture current-state VMs becomes a day-to-day reality for IT professionals. Let’s discuss some options to consider:

    Whether a developer or an IT Guy, the Cloud is a component of your toolbox you should consider. It’s a great journey!
    I’ll see you there.

Active Directory in Windows Azure ..

.. verry interesting.

It’s more than a Laugh-In reference, however. The Windows Azure Identity folks have built out an amazing way to federate identity across multiple platforms and locations, hybrid and cloud .. essentially confirming that the proper person has arrived at the doorstep, requesting services. How would you like to:

    .. if these fit your use cases, please reach out. I’ll connect you to the proper folks.
    It’s a great journey to the Cloud!
    I’ll see you there.

Windows Azure SQL Database Premium Preview

As of today, Microsoft is offering access to a limited preview of the new Windows Azure SQL Database Premium service. Unlike the standard offering, Premium provides reserved instances for databases requiring greater capacity and predictable performance.

To sign up:

  • Visit the Preview Page and click the ‘Try it Now’ button.
  • When approved, visit the Windows Azure portal and request a Premium Database Quota assigned to your Cloud Services.

For the Preview, customers will be limited to one database per logical server, priced at 50% of the eventual GA pricing. Please see the Windows Azure SQL Database Premium page for more details and availability updates.

Windows Azure GA Update: Web Sites, SSL and Mobile Services

Wahoo .. Windows Azure Web Sites Standard Tier (the Reserved tier is already in GA) is now released to General Availability, along with SSL support .. a big boon for folks seeking to deploy scalable, highly-available, commerce-enabled web sites. If you’re ready to get started, sign up for a trial and check out:

Windows Azure Mobile Services are good to go in GA as well, sporting a high-availability SLA for services running in the Standard and Premium tiers. Mobile Services makes it fast and easy to create mobile backends for a number of devices, simplifying authentication and push paradigms .. why reinvent the wheel? Beyond the services aspects, a number of native SDKs for devices and the Windows Store are available today. Here are a few links to get you started:

For some of the latest bits, please navigate to the Build 2013 site (the conference was held in June 2013) .. the session content is online and as fresh as it gets, speeding your way into the Cloud!

I’ll see you there.

Windows Azure at WPC

Finally taking a break, triple-tall-vanilla-soy-latte in hand and electrons flowing into my laptop. Windows Azure news from the Microsoft Worldwide Partner Conference 2013 in Houston:

  • The Windows Azure SQL Database will boast a premium offering shortly. As a shared database-as-a-service platform component, tenants can suffer from ‘noisy neighbors’, where other customers’ activities can impact your database performance. The premium offering will help manage this by assuring higher bandwidth and dedicated CPU capacity.
  • Windows Azure Active Directory integration will improve over time as well. Microsoft is working with several third-party SaaS vendors to integrate identity services with AD.
  • Limited previews of these updates will be made available in the coming weeks.

Autoscaling in Windows Azure

In better than a third of my customer conversations, the term ‘autoscale’ comes up. The term (loosely defined) relates to matching computing capacity to load levels (in non-fancy talk, it means you have enough servers to make sure you don’t tip over when too many users visit / interact with your site).

Companies are keen to ensure that they can take full advantage of the elastic aspects of the cloud: the ability to deploy more resources when they are needed, and take them down when they are not.

Some examples, you ask? There are more than a few application scenarios / patterns that present themselves. If your application fits any of these types of patterns:

    • On / Off application patterns: if you only need a system at definable intervals (payroll, training, accounting systems, and so on) .. the per-minute usage charges of Windows Azure go a long way toward cost savings and usage optimization.
    • Predictable burst activity pattern: maybe your site serves users around identifiable dates (Valentine’s Day, Mother’s Day, the silly season, and so on). The elastic nature of Windows Azure Cloud Services enables you to supply additional capacity to manage your customer load requirements.
    • Unpredictable growth pattern: it is expensive and time-consuming to deploy server capacity before it is actually needed .. something startup companies deal with every day. PAYG and elasticity can help small companies grow on their own time, using current cash flow to support their capacity requirements.
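Under every one of these patterns, an autoscaler is ultimately applying a threshold rule: scale out when a metric runs hot, scale in when it cools off, and never leave a min/max band. A minimal, generic sketch of that rule — this is illustrative logic with made-up thresholds, not the actual Windows Azure autoscale implementation:

```python
# Generic threshold-based autoscale rule: adjust the instance count one step
# at a time based on average CPU, clamped to a configured min/max band.
# Thresholds and defaults here are invented for illustration.

def target_instances(current, avg_cpu, scale_out_at=70, scale_in_at=30,
                     minimum=2, maximum=10):
    """Return the new instance count for the observed average CPU %."""
    if avg_cpu > scale_out_at:
        current += 1     # running hot: add capacity before users notice
    elif avg_cpu < scale_in_at:
        current -= 1     # running idle: shed capacity to save money
    return max(minimum, min(maximum, current))

print(target_instances(4, avg_cpu=85))  # → 5 (hot, scale out)
print(target_instances(4, avg_cpu=15))  # → 3 (idle, scale in)
print(target_instances(2, avg_cpu=15))  # → 2 (already at the floor)
```

Real autoscalers add smoothing (cool-down periods, sustained-threshold windows) so a single noisy sample doesn’t cause thrashing, but the core decision looks like this.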

Using an autoscaling scenario may meet your needs, optimizing usage / capacity considerations while balancing these against the ultimate cost. A few considerations:

  • Are you short on time and implementation skills, requiring an immediate and managed solution?
  • Do you have an experienced (or willing to learn .. my favorite) development / implementation staff?
  • Do you want something easy, inexpensive and unmanaged?

Depending on your scenario and competency:

    • If short on time and / or lacking implementation as a core competency, companies like RightScale offer managed (read: for a price) solutions to monitor and manage your capacity based on rules that you set. You can try RightScale for free, but be mindful of ongoing costs and balance them against operational efficiencies.
    • If you have a bit more time and an able development staff, the Autoscaling Application Block is a component you can add to your code that will give you greater control over when and how your capacity will be increased. Ready to get started? Please check out How to use the Autoscaling Application Block from MSDN. Ready for a ‘geekend’? Here’s the Autoscaling Namespace.
    • For easy and unmanaged (well .. managed, but by you .. but not in real time), we’ve recently announced portal-controlled autoscaling options in preview. These features are available in the Windows Azure Portal for your cloud services.

On the third point above, Scott Guthrie posted an update to the platform from the Microsoft Build conference regarding GA for paid websites and mobile services AND autoscale support .. complete with screenshots (check out the “How to enable autoscale” section).

In short (too late!), options abound; there’s at least one to meet the technical requirements of your project, or the technical skills of (and time allotted to) your dev team. Here we are: yet another way to improve and optimize our experiences in the Cloud!

I’ll see you there.

It Just Works: Excel 2010 to SSAS

It’s been a while since I posted a geeky article .. so, it’s well past time.

I have the pleasure of working with a talented data warehouse architect on my current project, and the need to connect Excel 2010 to SSAS became a reality this past week. The instructions to do so are easy enough:

  • Navigate to the Excel 2010 Data tab.
  • Click the ‘From Other Sources’ from the ‘Get External Data’ section of the ribbon.
  • Select ‘From Analysis Services’; you’re presented with a ‘server name’ and ‘credentials’ dialog. We are using a Windows Azure Virtual Machine for this project, so you may have to create an endpoint that maps to an obfuscated port number (write me for details). Fill out these fields, adding “:port number” after the server name, along with your local login information (ensure it represents a local account with an SSAS role on the system).
  • Click “Next”, and here’s where it gets dodgy:

  • You should be presented with a “Select Database and Table” dialog box. Select the cube you want to use and click “Next”.
  • In the “Save Data Connection File” dialog, click “Authentication Settings”, and then “None”. Trust me on this: you won’t see all the screens you need unless you have the system force you to enter login information in another step.
  • Click “Finish”; you may be prompted to save the .odc file, replacing the old one (I did this many, many times).
  • You will then see the “Import Data” dialog, which lets you place the PivotTable in your current worksheet. Select the location and click “OK”.
  • Now the fun starts:

  • You may get an error dialog: “An error occurred in the transport layer”. Click “OK”. Because you selected “None” in the previous step, you’ll be presented with a new dialog, the “Multidimensional Connection”.
  • In the “Multidimensional Connection” dialog, select “Analysis Server”; you will see the server you identified earlier.
  • Your User ID should come over too .. enter your system password and click “Next”.
  • Select your database in the next screen and click “Finish”.
  • Your .odc file should now be set up properly; save your Excel sheet and re-open it. You may see the “transport layer” error again, but after this, you’ll be prompted to re-enter your system password.
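For reference, the .odc file Excel writes here is a thin wrapper around an OLE DB connection string using the Analysis Services provider. A minimal sketch of the shape of that string — the server name, port and catalog below are placeholders, not values from this project:

```
Provider=MSOLAP;Data Source=myazurevm.cloudapp.net:57000;Initial Catalog=MyWarehouseDB
```

`Provider=MSOLAP` selects the Analysis Services OLE DB provider, and the port suffix on `Data Source` corresponds to the Azure endpoint mapping described in the steps above. If the connection misbehaves, inspecting the .odc file against this shape is a quick sanity check.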

    Ideally, you should be able to cache this login information .. I’m looking into that and will update the post.