The TED Comment Acquisition System is Cool

.. I’m sure there’s a better name. However, at the bottom of each video, you see:

[image]

IMHO .. that rocks.

Building a Windows Azure Development Environment

Happily, this post has become irrelevant thanks to the New and Improved Microsoft Web Platform Installer (WebPI). That spiffy little kit interrogates your system for the proper dependencies and installs the Microsoft Web Stack, tools, SDKs and the like. To get started, click on the link.

Microsoft updated the Windows Azure Training Kit to June 2012 as well; you’ll find plenty of information therein .. especially labs using the new bits and highlighting the new features of the platform.

The content below is preserved for archive only:

As I’ve been working with some of the best and the brightest the WAISG has to offer, I think it’s time to provide a link to assist others in some Windows Azure 101 (a/k/a “Getting Started”) bits and pieces. In this post, I’ll cover setting up your development environment on a Windows 7 or Windows Server 2008 R2 system.

  • Make sure you have the current Service Pack for your operating system. The easiest way to do this is to click on the Orb (or Start, in WS2008R2) and type ‘Windows Update’. Windows Update will detect if your system is patched to current levels.
  • Install Visual Studio 2010 or Visual Studio 2010 Express (the free version). Be sure to check Windows Update again after installation (and rebooting) to ensure you have the current Service Pack (SP1) installed.
  • Using the Web Platform Installer, install the Windows Azure SDK for .NET.
  • Visual Studio 2010 Express installs SQL Express by default, but your development environment may include a full (or development) version of SQL Server. In either case, check Windows Update for a current Service Pack for your version. The SQL Express management UI is a separate download: SQL Server Management Studio Express. If you have full SQL Server, the UI is included. For help running the UI, please see “Using SQL Server Management Studio” on MSDN.
  • Install the Windows Azure Training Kit. This kit is chock-full of information, tutorials and source code. The current version is “January 2012”. As this is updated frequently, I suggest you do the full install (about 500 MB) into a separate folder on your hard drive.
  • Get the additional bits you need for the labs. To do this, click on “Prerequisites.htm” in the folder you installed the Windows Azure Training Kit. This will start an application that will interrogate your system and advise any components you need for the labs you want to run.

:: whew ::    Only a few more steps (I promise). In order to use the local emulators:

  • Compute: You MUST run Visual Studio as an administrator (link to a Windows 7 Forum, but works for both W7 and 2008).
  • Storage: Your logged-in user MUST be a member of the SQL Server sysadmin role (link to David Browne, who provides a script that does this for a local user .. as long as that local user is an administrator of the local system .. otherwise, contact your IT). This is required as the local user must be able to create databases for storage of blobs, tables and queues during development.

With these bits installed, you should be able to conquer any of the labs in the Windows Azure Training Kit with ease .. and speed your way into the Cloud!

I’ll see you there.

WAISG: Windows Azure Enablement Resources

So. Where have I been these past TWO months?

I am pleased to explain my whereabouts: I’ve joined the Windows Azure Inside Solution Group. The WAISG is a business and technology enablement team devoted to accelerating the velocity of Windows Azure deployments worldwide.

Whew. Quite a mouthful, that. The program statement is worse .. but contains only one comma.

Our team supports 35 countries in 7 languages and will get you to the next step of your Windows Azure deployment. In a lot of cases, we’ll point you to public resources; in others, we’ll emulate and escalate as needed to get the job done with you.

We’re easy to reach: navigate to the Windows Azure home page at http://windowsazure.com and select the ‘Click to Chat’ bubble. We’ll pick up.

The best part: I get to work with some amazingly talented people, all of whom subscribe to my favorite quote: “When you stop learning, you stop growing”.

Let me tell you: this team is growing like mad .. collective brains learning from our customers and each other .. every day. It’s a great group, and one I am proud to guide, mentor and grow my own knowledge with.

You’ll be hearing from us.

So .. who misses me?

Hell, I find that I do. I hope that you do, now and again.

This is the mea culpa “I am sorry I have not been writing” post. That said, I am sincere .. I really, really miss posting. This is a heartfelt apology to my readers and to myself.

What have I observed in the world since my last post? Well:

  • Republican madness
  • Democratic sadness
  • Regime changes
  • Organization re-arranges

And in me?

  • Growth: personal, and for those with whom I work.
  • Growth: personal, and for those with whom I live.
  • Assessment and re-assessment.
  • Growth.

Watch this space for the nuts and the bolts. Thanks for keeping me in your readers.

But .. it’s only three hours ..

[image]

Whoops. Note the system time (upper right-hand corner) versus the BIG display time (the one I actually use) .. fortunately, the alarms appear to be connected properly.

I think I’ll set my night stand alarm .. just in case.

HTC Aria on AT&T with Android 2.1.

Flash? Who needs Flash?

Not Apple, and now, not Microsoft.

For background, Adobe Flash is a browser plug-in that enables rich media and rich user interfaces. Over time, we’ve all used it for YouTube videos, spiffy re-sizing menus and games like FarmVille and Mafia Wars.

In fact, Flash has been the de facto standard for rich UI over the past decade, eclipsing all others (including Silverlight .. the Microsoft entry in the space).

Ahh .. Silverlight. I barely knew ye.

That’s a lie: while at Microsoft, I worked diligently launching Silverlight 1.0, driving worldwide partner adoption in the early days. Painful, but we had some exciting, .NET-driven, browser-based applications adopting the plug-in playbook. The advantage: one code base .. developers could write code with known conventions, extending their .NET experience into a new, plug-in world.

I digress, therefore, I am.

For more background, here are a few well-known fun facts (at least, in the developer community):

So. Mobile issues aside. The answer? HTML5.

HTML5 boasts a number of syntactical features (features and functionality that conform to the language itself, provided as part of the platform) .. which eliminate the need for a plug-in.

  • Want videos? Embed a <video> tag .. built into HTML5, which includes position, height, width, codec, etc., etc. and etc.
  • Want absolute positioning? It’s there, built into HTML5.
  • Want SEO (Search Engine Optimization)? It’s built into HTML5.

If the operative term in all cases is “built into ..”, suffice it to say: it is.
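
As a minimal sketch of that first bullet: the <video> element can be written directly in markup, or built up in script. The source URL and dimensions below are placeholders, not a real asset:

```typescript
// Build an HTML5 <video> tag as a string; src, width and height are
// placeholder values for illustration only.
function videoTag(src: string, width: number, height: number): string {
  // The bare "controls" attribute asks the browser for its built-in
  // play/pause/seek UI -- no plug-in required.
  return `<video src="${src}" width="${width}" height="${height}" controls></video>`;
}

console.log(videoTag("movie.mp4", 640, 360));
// <video src="movie.mp4" width="640" height="360" controls></video>
```

The browser handles decoding and rendering natively; swapping codecs or adding fallback sources is a matter of attributes and child <source> tags, not a separate runtime.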

Why do I bring this up? Well, TechCrunch (and a host of others) report: “Microsoft Excises Flash And Plugins From Metro Internet Explorer In Windows 8”. The title of the article says it all: the shipping browser atop Windows 8 will not support (or need) vendor plug-ins.

This is significant .. remove the platform initiative, and you remove the need for developers to write to the platform.

Hey HTML5 developers: Start here, and here, and here.

Amazon: $185.00 for Shipping on a $15.20 order?

Wow .. something is broken:

[image]

This is an order for 20 2GB USB flash drives .. Hunter keeps losing his. They are listed at $0.76 each .. PERFECT for my need.

But $185.00 shipping? Madness.

Where is the ‘cancel’ button?

“Sunrise” Phase for the .XXX Top-Level Domain

I posted “Your Brand in Triple-X (and I don’t mean “times three”)” upon hearing the .XXX (Adult Industry Domain Extension) TLD discussion was alive and well. Quite well, in fact .. looks like the TLD is moving into an early operational phase, called “Sunrise” by Network Solutions.

The “Sunrise” phase comes in three flavors, each requiring validation to proceed:

  • Sunrise AT – Adult Trademark Holders: Companies who hold trademarks for adult products and services and want first crack at the .XXX domain extension for their business. This group must plan to host a live web site and be able to prove product and service trademarks.
  • Sunrise B – Block for Trademark Holders: A block for companies who hold trademarks for products and services and want to protect their Trademark from getting a .XXX treatment. Think Microsoft.xxx or Google.xxx (wouldn’t sit well with the companies, I’m sure). This group is NOT intending to host a live web site using the .XXX TLD and must prove product and service trademarks.
  • Sunrise AD – Adult Domain Holders (Grandfathering): Companies who have current .com, .net, etc. domains and want to claim the .XXX for their current site. This group must plan to host a live web site and be able to prove product and service trademarks.

I suspect the rub will be between the AT and AD groups .. I’m certain there are some product crossovers. Maybe we’ll even get WhiteHouse.com back!

Will be fun to watch. For more details, see the .XXX Pre-Registration page on Network Solutions.

‘Pottermore’ (mostly) Opens for Business

“Mostly” because Pottermore is only open to the first one million, invite-only Harry Potter fans. Sadly, I’m not among them.

I am a fan, though. I have (and have read) both the US and UK versions of the seven-book saga .. in hardback, no less. I have been known to be bemused at the differences between the two.

Pottermore seeks to engage readers and fans in an interactive world of wizards, witches, wands and more (I’m hoping for “wow” as well). The site will open to the public on October 18th, 2011, although public registration will begin earlier (details on the site).

Ms. Rowling describes the experience in the video below. Quite engaging.

Entertainment Weekly posts “Pottermore: First impressions of the new interactive Harry Potter Site” .. I guess they got a Magical Quill.

Lucky buggers.

The Cloud: A View from Above – Private Cloud and the Hybrid Evolution

As if the term “Cloud Computing” wasn’t already severely overloaded, terms within the overarching technologies are even more overloaded, and many are quite misunderstood.

In this post (and in others in this series), I’m going to try to clarify a few of the definitions, and the subtle differences between various definitions as they are used. I’ll cover the Private Cloud and the Hybrid Cloud (links to Wikipedia, but my thoughts follow):

Private Cloud: In short, a Private Cloud is a cloud where the data access is restricted to specific users, typically within the same organization (or company) and behind a corporate firewall. Beyond the basic advantages of Cloud Computing (reduced IT infrastructure costs and management, “always up”, increased business and IT agility), there are several business reasons for keeping data in a Private Cloud:

  • Your applications store customer data containing Personally-Identifiable Information (PII), which could incur legal or financial risks if compromised.
  • Your application manages e-commerce transactions, credit card numbers, shipping addresses, etc.
  • You store corporate-owned, sensitive, mission-critical or proprietary data.

In these (and in many other) cases, the knee-jerk reaction of IT and Business is to keep these applications and data on-premises, safe behind the corporate firewall. In some organizations it may be difficult to argue against this mindset, but there are alternatives that enable businesses to enjoy the basic benefits of Cloud Computing in a secure manner.

A Private Cloud typically begins life as an application or services deployed in an on-premises data center. Access to the data (Authentication, Authorization and Accounting, commonly known as AAA) is clearly defined and controlled by local IT resources. On-premises users can get to the application over their LAN; external users can use IPsec or VPN to access the application securely from outside.

Now, with proper security (AAA over secure IP protocols, as noted above) a Private Cloud can exist in a vendor data center, provided the organization utilizes the same security protocols and IT controls as they would for an on-premises deployment. The rub? Well, read the news (link to a Bing search for the latest .. there’s always more). Suffice to say: many enterprises want absolute assurances data held away from their premises will be secure.

That said, it’s not that simple. Beyond advanced and highly-controlled access security, there are a few other bits and pieces that a hosted Private Cloud (one that is hosted at a vendor data center) would need to navigate:

  • Privacy: Monitoring, monitoring, monitoring. No, not performance monitoring. The monitoring to which I refer applies to communications in and out of a Private Cloud, based on the widely-discussed “NSA has massive database of Americans’ phone calls” (link to USA Today) story that broke a while back. Maybe the data itself isn’t directly accessible, but inferences about how the data is being used can be captured. This isn’t just a Cloud issue, by the way; vendors and enterprises will experience these challenges, hosted or on-premises.
  • Compliance: contractual and financial assurances (read: protections and remedies) that can be activated should a vendor fail to assume the risk of protecting the data using recognized practices and protocols. Note: this requirement brings with it a handy-dandy audit cycle that a vendor must also navigate.
  • Legal durability: last I checked, a subpoena is durable (a court order for information that stands up nicely in the courtroom) should a governing body (State or Federal) “request” (quotes are mine) data from a non-enterprise-owned data center. A vendor would surrender the data without many questions; an enterprise would consider its options and consult in-house counsel before releasing data.

This is why enterprises will tend to run scared of deploying content in a non-enterprise-owned data center. Can you blame them? Before we find ourselves in the courtroom, let’s discuss for a bit. The logical evolution is not necessarily to avoid hosted private clouds, but to evaluate the content stored in on- and off-premises data centers. In this exercise, an Enterprise will identify types of data, including sensitive data (this is a short list):

  • Static public content (easily hosted in CDNs worldwide .. icons, static “about” pages, legal pages, etc.).
  • Some dynamic content that needs to be available to the public (and therefore, will need to scale, or be redirected to public, scalable resources) .. calendar- or location-based query results, catalogs or pricing data (updated via business rules), and so on.
  • Other dynamic content that needs to be held securely, and exposed only during relevant need. This can include PII, Credit Card, Customer status, and much more. In fact, some of these data need not be exposed at all; rather, secure queries to an internal system can yield responses that let the application get what it needs without viewing the actual data (querying if a token to a credit card account has sufficient balance, or confirming a shipping address via an encrypted form post).
  • Mission-critical data that has explicitly-defined audiences and uses.
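
The pattern in the third bullet (answering a question about sensitive data without ever exposing the data itself) can be sketched roughly as follows. The token format, vault structure and function names here are hypothetical illustrations, not any specific vendor API:

```typescript
// Hypothetical in-memory "vault" mapping opaque tokens to card balances.
// In a real Hybrid Cloud, this lookup would live behind the corporate
// firewall, reachable only over secured (IPsec/VPN) channels.
const vault = new Map<string, number>([
  ["tok-1234", 250.0], // the token stands in for the real card number
]);

// The public-facing application asks "is there sufficient balance?"
// and receives a yes/no answer -- the card number and the balance
// never leave the vault.
function hasSufficientBalance(token: string, amount: number): boolean {
  const balance = vault.get(token);
  return balance !== undefined && balance >= amount;
}

console.log(hasSufficientBalance("tok-1234", 100)); // true
console.log(hasSufficientBalance("tok-1234", 500)); // false
```

The design choice: the public tier holds only tokens, so a compromise there yields nothing usable, while the sensitive data stays under on-premises (or tightly controlled) IT.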

Avoiding the issue of publicly-available data (the first two bullets, above), we raise several questions for Enterprises regarding sensitive data:

  • Are there ways that an Enterprise can protect sensitive data in an Internet paradigm?
  • How should an Enterprise control access to sensitive data by authorized entities?
  • How can an Enterprise protect sensitive and mission-critical data?

In this post, I am not proposing the answers. Not yet, anyway. I am, however, posing questions an Enterprise should ask. For starters:

  • Perform an analysis and inventory of systems, audiences and security requirements.
  • Prioritize systems based on business need and expected life; consider replacing, rewriting or redirecting system assets based on audiences, expected life and other factors.
  • Create a project plan with clear (and widely-publicized) milestones so the enterprise is aware of progress and potential impacts to system availability.

In this exercise, you will discover your enterprise is describing an evolution: establishing secure access to assets residing in a local data center or in a Private Cloud. The analysis will further suggest certain assets be addressed in another logical paradigm: the Hybrid Cloud. So, let’s talk about the Hybrid Cloud. My thoughts follow:

Hybrid Cloud: Loosely stated, a Hybrid Cloud consists of data and services held in on- and off-premises facilities, with access to sensitive data secured by VPN and IPsec protocols. Consider a company that stores customer address data in its local data center, under the physical control of its Enterprise IT. IT enables access FROM public resources (catalog and shipping sites) via secure protocols.

Here lies the objective of this post: in considering the evolution from Private to Hybrid, an Enterprise will arrive at the fact that some data must reside under the control of on-premises IT .. control over these bits will include the questions above. That said, I am not suggesting (extraction of any suggestions is at the risk and responsibility of the affected parties) that Enterprises expose their data to the world at large without adequate (and tested) protections.

Solutions? Yah. Lots:

  • Windows Azure offers the AppFabric Service Bus, a component that provides endpoint security .. a paradigm where secure connectivity is maintained by connecting applications to single points of access to other components. Disparate applications can connect to a single endpoint, simplifying and securing Hybrid Cloud components.
  • Amazon Web Services offers the Amazon Virtual Private Cloud (VPC), which enables an enterprise to launch a private and isolated section of AWS in a user-defined virtual network.
  • VMware offers its vCloud product, which enables enterprises to deploy workloads on shared infrastructure with built-in security and role-based access controls.

In these three cases (and there are others), Out-of-Cloud access can be enabled via IPsec and VPN. Your mileage may vary widely, depending on the analysis of your infrastructure and mapping this analysis against your requirements.

I do not intend this to be a pitch for deploying a Hybrid Cloud. However, I do suggest that enterprises consider and weigh their options when identifying the types of data that should be hosted on-premises, versus a trusted vendor.

Want to know more? Please read my collection of Cloud Computing posts, or reach out to me for more detail.