Articles

Showing posts from December, 2008

December NCOIC Plenary Presentations

Presentations from the NCOIC "Cloud Computing for Net-Centric Operations" sessions held earlier this month have been posted online in the Federal Cloud Computing wiki. The event featured speakers from IBM, Cisco, Microsoft, HP, and Salesforce. Access is free, but registration is required.

- NCOIC - Bob Marcus (Leader, NCOIC Cloud Computing Team). Slides: "Cloud Computing for Net-Centric Operations"
- HP - Shawn Sami (Chief Technologist, HP Software Federal). Slides: "Secure Cloud Computing and RACE"
- Salesforce.com - Peter Coffee (Director of Platform Research). Slides: "Building Mission-Critical SaaS Applications"
- IBM - David Lindquist (Cloud Computing Chief Architect and IBM Fellow). Slides: "IBM's Perspective on Cloud Computing"
- Cisco - Glenn Dasmalchi (Chief of Staff, Cisco CTO Office) and James Urquhart (Marketing Manager for Cloud Computing). Slides: "Cloud Computing: Trends and Opportunity"
- Microsoft - Stan Freck (Dire

Booz|Allen|Hamilton Launches "Government Cloud Computing Community"

As a follow-up to a Washington, DC Executive Summit event, Booz Allen Hamilton recently launched an on-line government cloud computing collaboration environment. In an effort to expand the current dialog around government cloud computing, the strategy and technology consulting firm wants to build a community "to exchange ideas, share lessons learned, and generally support the government’s progress in leveraging Cloud Computing." The current topics listed on the site are themselves very interesting and include: Cloud Computing Case Studies; Cloud Economic Models; Enterprise Architecture; and Cloud Computing War Game. Welcome aboard, BAH! I expect that the mere presence of your site will heighten interest in cloud computing within the Federal marketplace. For additional information or to join the community, send an email to cloudcomputing@bah.com

Is Google Losing Documents?

John Dvorak posted this question on his blog Saturday, and as of Sunday evening it had 52 responses! This is not a good thing for building confidence in cloud computing. Or is it? The story is that users of Google Docs were receiving the message “Sorry! We are experiencing technical difficulties and cannot show all of your documents.” Apparently the documents were restored by Saturday evening, but this incident reinforces two points: just like any other human enterprise, cloud computing isn't infallible; and the Google cloud was apparently able to restore all the documents. From the thread it seems that Google Docs was down for one day. Not knowing what happened, this recovery seems to have been just as good as any other global enterprise's. So what's the beef with cloud computing? I don't see any. I do, however, see a problem with the document recovery delay. The issue is that a cloud service should be designed in a way that makes such failures invisible to the users. This is

Cryptographic Data Splitting? What's that?

Cryptographic data splitting is a new approach to securing information. This process encrypts data and then uses random or deterministic distribution to place it into multiple shares. The distribution can also include fault-tolerant bits, key splitting, authentication, integrity checks, share reassembly, key restoration and decryption. Most security schemas have one or more of the following drawbacks:

- Log-in and password access often does not provide adequate security.
- Public-key cryptographic systems rely on the user for security.
- Private keys stored on a hard drive are accessible to others or through the Internet.
- Private keys stored on a computer system configured with an archiving or backup system can result in copies of the private key traveling through multiple computer storage devices or other systems.
- Loss of or damage to the smartcard or portable computing device in biometric cryptographic systems.
- The possibility of a malicious person stealing a mobile user's smartcard or portable
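For intuition only, here is a minimal sketch of the share-splitting idea. This is a toy XOR-based scheme in which every share is required for reconstruction; it is an illustration of the concept, not the patented cryptographic data splitting process described above, and the function names are my own.

```python
import os

def split_secret(secret: bytes, n_shares: int) -> list[bytes]:
    """Split `secret` into n XOR shares; every share is needed to rebuild it."""
    # The first n-1 shares are pure randomness...
    shares = [os.urandom(len(secret)) for _ in range(n_shares - 1)]
    # ...and the final share is the secret XORed with all of them.
    last = secret
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def reassemble(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the secret."""
    secret = bytes(len(shares[0]))  # all-zero starting block
    for s in shares:
        secret = bytes(a ^ b for a, b in zip(secret, s))
    return secret
```

Any single share on its own is indistinguishable from random noise; only combining all of them recovers the plaintext. Production schemes add k-of-n threshold reconstruction, integrity checks and key management on top of this basic idea.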

Nokia Is The New Blackberry Of The Emerging Countries

Nokia announced a mobile email service, Mail on Ovi, currently targeting emerging markets. Nokia has had great success selling reliable and inexpensive handsets in these markets. In countries such as India, consumers never used voice mail on their landlines and went through the mobile revolution using SMS as their primary asynchronous communication medium. Many of these users are not active email users, at least not on their mobile devices. If Nokia manages to provide a ubiquitous user experience using Ovi to bridge email and SMS on not-so-advanced data networks, it could cause disruption by satisfying the asynchronous communication needs of hundreds of thousands of users. Smartphones would certainly benefit from this offering and give Blackberry a good run for their money. Nokia completed the Symbian acquisition, which makes them a company whose OS powers 50% of all the smartphones in the world. Symbian is still a powerful operating system powering more than 200 milli

Now really. Should the Obama administration use cloud computing?

It's amazing what a little radio time will do! Since Sunday's broadcast, I've been asked numerous times about my real answer to the question "Will 'Cloud Computing' Work In White House". Although I would never presume to be in a position to advise the President-elect, I'm more than happy to add my voice to the Center for Strategic and International Studies (CSIS) and the distinguished list of contributors that recently released the CSIS Commission on Cybersecurity for the 44th Presidency. I truly believe that cloud computing technology can be used to implement some of their recommendations. One in particular is their recommendation for a National Office for Cyberspace (NOC) and a new National Security Council Cybersecurity Directorate (NSCCD). Along with the relevant agencies, these organizations would: "Assume expanded authorities, including revised Federal Information Security Management Act (FISMA) authorities, oversight of the Tru

NPR "All Things Considered" considers Government Cloud Computing

My personal thanks to Andrea Seabrook, Petra Mayer and National Public Radio for their report "Will 'Cloud Computing' Work In White House?" on today's "All Things Considered". When I started this blog there was doubt about cloud computing being anything but a fad that would disappear in a few weeks. Now it's clear that an important dialog is underway on the merits of this new approach for the Federal government. I look forward to continuing the dialog and, as always, welcome your comments.

De-coupled Cloud Runtime And Demand-based Pricing Suggest Second Wave Of Cloud Computing

A couple of days back Zoho announced that applications created using Zoho Creator can now be deployed on the Google cloud. On the same day Google announced its tentative pricing scheme for buying resources on its cloud beyond the free daily quota. We seem to have entered the second wave of cloud computing. Many on-demand application vendors, who rely on non-cloud-based infrastructure, have struggled to be profitable because the infrastructure cost is way too high. These vendors still have value-based pricing for their SaaS portfolios and cannot pass the high infrastructure cost on to their customers. The first wave of cloud computing provided a nice utility model to customers who wanted to SaaS up their applications without investing in infrastructure, charging their customers a fixed subscription. As I observe the second wave of cloud computing, a couple of patterns have emerged. Moving to the cloud, one piece at a time: The vendors have started moving the
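A demand-based scheme of the kind Google outlined is easy to express: usage under the free daily quota costs nothing, and only the overage is billed. A hedged sketch follows; the quota and rate figures are made-up illustration values, not Google's actual prices.

```python
def monthly_bill(daily_cpu_hours: list[float],
                 free_daily_quota: float = 6.5,
                 rate_per_hour: float = 0.10) -> float:
    """Pay-as-you-go: bill only the usage that exceeds the free daily quota."""
    overage = sum(max(0.0, hours - free_daily_quota) for hours in daily_cpu_hours)
    return round(overage * rate_per_hour, 2)

# Three days at 5, 7 and 10 CPU-hours: only the 0.5 + 3.5 overage hours are billed.
print(monthly_bill([5, 7, 10]))  # → 0.4
```

Contrast this with the fixed-subscription model of the first wave: the vendor's bill now tracks actual demand, which is exactly what lets a thin-margin SaaS vendor stay profitable.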

HP Brings EDS Division Into Its Cloud Plans

The Street reported earlier this week that Hewlett-Packard's EDS division has won a $111 million contract with the Department of Defense (DoD) that could eventually support the U.S. military's cloud-computing efforts. EDS confirmed it will work with DISA officials to conduct security reviews of DoD networks, databases, systems and applications. It will also evaluate DoD security policies, which could further boost the department's cloud strategy. Even though security is universally seen as a key barrier to cloud adoption, this move seems to indicate a willingness on DISA's part to work through the issues that can support eventual widespread cloud deployments. Since EDS also has the Navy Marine Corps Intranet (NMCI) contract through 2010, the hard-won experience that EDS gained from that painful chapter could possibly be leveraged to accelerate cloud services to the desktop. It could also give HP a leg up on the upcoming Navy Consolidated Afloat Networks and Ente

Cloud Computing and the Process Integration Era

The Industry Advisory Council (IAC) is a non-profit, non-partisan organization dedicated to fostering improved communications and understanding between government and industry. Through its affiliation with the American Council for Technology (ACT), IAC provides a forum for industry to collaborate with and advise government executives on IT issues. In fulfilling this role, the ACT-IAC Transition Study Group recently released a paper titled "Returning Innovation to the Federal Government with Information Technology". Since the Obama administration's stated goals include the "use [of] technology to create a more transparent and connected democracy" and the employment of technology "to solve our nation's most pressing problems", this group's recommendations should certainly be considered. For this audience, their statements about information technology creating two major categories for performance breakthroughs in government bear attention. Accordin

The Tactical Cloud

When cloud computing first came into vogue, there was a rather serious discussion about the private cloud concept. The whole idea of cloud computing seemed to argue against implementing such a capability behind organizational walls. Although in some circles the idea of a "private cloud" is being subsumed by the more acceptable "enterprise cloud", last week's discussions at the Network Centric Operations Industry Consortium brought up a different cloud concept - the "tactical cloud". Now before you cry foul, please hear me out. First of all, the discussion was centered on how the US Department of Defense (DoD) could possibly leverage cloud computing technology. In the view of many, the development of a "private" or "enterprise" cloud for the DoD is a fait accompli. Besides, the DoD has multiple private internets (NIPRnet, SIPRnet, JWICS, etc.), so private clouds seem like an appropriate evolution. Enterprise clouds, however, seemed

"Cloud Musings" Now on SYS-CON Media "Cloud Computing Journal" !!

I'm happy to announce that a recent "Cloud Musings" article, " Commercial vs Federal Cloud Computing " has been reposted on SYS-CON Media's " Cloud Computing Journal ". Thank you SYS-CON for making selected "Cloud Musings" articles available to your audience. I am honored by your support and look forward to providing my personal insights to your readers. SYS-CON Media, founded in 1994, is widely recognized in the Internet-technology and magazine publishing industries as the world's leading publisher of i-technology magazines, electronic newsletters, and accompanying i-technology breaking news, education and information Web portals. Cloud Computing Journal is their publication targeting the cloud computing community.

How to make clouds interoperable and standard !!

This has been a huge part of my life over the past few weeks! This is my personal view. WARNING: DON'T EXPECT THE ANSWER TO BE FOUND BELOW !!! There are three basic approaches to this issue being aggressively discussed right now:

- “Standards Body” Approach
- “Adopt Proven Technology” Approach
- “Customer Driven” Approach

All three approaches have value and all three have their problems, including:

- Global agreement on standards could be a long process
- “Proven Technology” may mean “Proprietary Technology”
- Multiple industry customers could result in industry-linked standards for the cloud

So what is an embryonic industry to do? A hybrid of course !! Options include, but are not limited to:

- Release all details of a selected “Proven Technology” to industry and adopt it as a basis for an open standard
- Embrace multiple standards, each set optimized for industry ROI models
- “Interoperability Rating” issued after a standards body technical review

(And we've only just begun)

The Tension between Public and Private Clouds

Last week, during discussions on cloud interoperability and standards in Israel, I saw for the first time a real dichotomy in the value of public (external) and private (internal) clouds. This tension seemed to arise from the fact that CIOs considering moving applications to an outside cloud vendor would probably set their highest priority on legacy applications. The logic was that since it is more costly to maintain older applications internally, moving those applications to the cloud would represent a high-value option. This high-value customer option, however, seemed to present a worst-case success scenario for public cloud providers. Is this true? The general lack of internal metering for applications would also make an internal vs. external ROI business case a fairly difficult task. Could these tensions actually lead to different business models for internal and external cloud purveyors?

Cloud Computing for Continuity of Operations (COOP)

Recently, I've been focusing on cloud computing for COOP. The way I look at it, many government agencies are already using commercial shared facilities as COOP sites, and the cloud simply represents a virtual shared facility. Although there are still many security, privacy and policy issues that need to be addressed, it seems to me that cloud computing could still provide a cost-effective and efficient COOP capability for certain operational and mission subsets. A major key to success would be identifying which non-critical applications and infrastructure resources an agency could migrate to a cloud platform. By definition, these would be resources the agency could go without for two days. In general, applications, storage and server resources related to ad-hoc projects, research and development efforts, and certain peak-time requirements could clearly be considered. To ensure operational and contractual flexibility, only solutions that could work across multipl

NCOIC Plenary Session

Hopping a plane to the west coast today to attend the NCOIC Plenary in Costa Mesa, California. The first day's "Cloud Computing for Net-Centric Operations" agenda includes:

- David Ryan, Chief Architect, HP Federal Technology Solution Group - "Secure Cloud Computing"
- Peter Coffee, Salesforce.com Director of Platform Research - "Building Mission-Critical SaaS Applications"
- David Lindquist, IBM Cloud Computing Chief Architect and IBM Fellow - "IBM's Perspective on Cloud Computing"
- Glenn Dasmalchi, Chief of Staff, Cisco CTO Office - "Cloud Computing: Trends and Opportunity"
- Stan Freck, Microsoft Director, Software + Services – US Public Sector - "Software + Services – the next generation of computing"

The second day's team sessions are on Enterprise Cloud Computing and Cloud Computing for Tactical Networks and Rapid Deployment. Briefings are expected from Stuart Charlton (Elastra), Daniel Nurmi (Eucalyptus), Bert Armijo (3Tera) and someone

Dataline named "Top 100 Cloud Computing Company"

SYS-CON's Cloud Computing Journal included Dataline in its expanded list of the most active players in the cloud ecosystem. In adding Dataline to the "Top 100" list, Jeremy Geelan noted that "...the role this company fills as a mid-level Federal System Integrator is crucial to the adoption of these technologies by the public sector". In a related announcement, Dataline was also named a "Cloud Computing Technology Contributor of 2009". Jeremy Geelan is Sr. Vice-President of SYS-CON Media & Events. He is Conference Chair of the International Cloud Computing Conference & Expo series and founder of Cloud Computing Journal. He is also executive producer and presenter of "Power Panels with Jeremy Geelan" on SYS-CON.TV. Thank you, Jeremy, for including Dataline.

Autoscaling into the cloud - Good or Bad?

I always saw the ability to autoscale into a cloud infrastructure as a good thing. George Reese presented a differing view on the O'Reilly blog recently: "Auto-scaling is the ability (with certain cloud infrastructure management tools like enStratus—in a limited beta through the end of the year) to add and remove capacity into a cloud infrastructure based on actual usage. No human intervention is necessary. It sounds amazing—no more overloaded web sites. Just stick your site into the cloud and come what may! You just pay for what you use. But I don't like auto-scaling." While I agree with the need for an enterprise to do capacity planning, I think that the discussion goes far beyond an overloaded website. I believe that the real value of autoscaling lies in its support of a service-oriented architecture (SOA), especially when services are auto-discovered and workflows are created on the fly with mash-ups.
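To make the debate concrete, here is a minimal sketch of the kind of target-tracking rule an auto-scaler applies each monitoring interval. The function name, thresholds and bounds are hypothetical illustrations, not enStratus's actual API; note that the `min_n`/`max_n` clamp is exactly where Reese's capacity-planning concern re-enters the picture.

```python
import math

def desired_instances(current: int, avg_cpu: float,
                      target: float = 0.6, min_n: int = 2, max_n: int = 20) -> int:
    """Size the fleet so average CPU utilization moves toward `target`.

    `avg_cpu` is the fleet-wide utilization observed over the last
    interval, expressed as a fraction (0.0 - 1.0).
    """
    if avg_cpu <= 0:
        return min_n  # idle fleet: shrink to the floor
    desired = math.ceil(current * avg_cpu / target)
    # Clamp to the bounds set by a human during capacity planning.
    return max(min_n, min(max_n, desired))

print(desired_instances(5, 0.9))   # overloaded fleet grows → 8
print(desired_instances(5, 0.1))   # near-idle fleet shrinks to the floor → 2
```

Autoscaling is not "no capacity planning"; it just moves the planning from provisioning individual boxes to choosing the target utilization and the clamp bounds.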

Cloudera must be reading the script!

"Cloud computing leapt out as the most obvious way to address enterprise large data problems" - Ken Pierce, IT Specialist, DIA-DS/C4ISR "We view Hadoop as the key enabler...[in] optimizing the [cloud infrastructure] platform to ingest and present information effectively in the petascale." - Robert Ames, Director & Deputy CTO, IBM Federal Successful mission accomplishment in the DoD, DHS and Intelligence Communities revolve around their ability to process "Big Data". Hadoop is all about processing "Big Data". The ability to process big data is crucial to mission accomplishment because this is the core technology for processing terabyte-sized datasets with on-line applications. This capability is also needed to enable low-latency in automated decision tools. Since a typical software engineer has never used a thousand machines in parallel to process a petabyte of data, new software tools are critical to the sucessful implementation of solutions

Incomplete Framework Of Some Different Approaches To Making Stuff

Steve Portigal sent me an article that he wrote in Interactions magazine, asking for my feedback. Unfortunately the magazine is behind a walled garden and would require a subscription, but if you reach out to Steve he should be able to share the article with you. In the absence of the original article, I will take the liberty of summarizing it. Steve describes how companies generally go about making stuff in his “incomplete” framework:

- Be a Genius and Get It Right: a one-person show that gets it right, such as James Dyson's vacuum cleaner.
- Be a Genius and Get It Wrong: a one-person show that gets it wrong, such as Dean Kamen’s Segway.
- Don’t Ask Customers If This Is What They Want: the NBA changing the basketball from leather to synthetic microfiber without asking the players.
- Do Whatever Any Customer Asks: implementing changes exactly as customers request them, without understanding the real needs.
- Understand Needs and Design to Them: discovery of the fact that women shove

Animoto = Automated Imagery PED

Over the past two days, I've spent quite a bit of time with Stevie Clifton, co-founder & CTO of Animoto. Besides being one of the coolest things I've seen in years, Animoto is giving us a glimpse of automated imagery PED (Processing, Exploitation, Dissemination). First, an introduction. Animoto Productions, a self-described "bunch of techies and film/TV producers who decided to lock themselves in a room together and nerd out", has released a web application that automatically generates professionally produced videos. The site uses their patent-pending technology and high-end motion design to deliver fully customized orchestration of user-selected images and music. Using Cinematic Artificial Intelligence technology, "it analyzes and combines user-selected images and music with the same sophisticated post-production skills & techniques that are used in television and film." Their AWS spike story is now famous in cloud computing circles. Now let's

World Summit of Cloud Computing 2008

Video by Animoto using cloud computing technology. (Done in 20 minutes for free)!!

Does Cloud Computing Help Create Network Effect To Support Crowdsourcing And Collaborative Filtering?

Nick has a long post about Tim O'Reilly not getting the cloud. He questions Tim's assumptions on Web 2.0, network effects, power laws, and cloud computing. Both of them have good points. O'Reilly comments on the cloud in the context of network effects: "Cloud computing, at least in the sense that Hugh seems to be using the term, as a synonym for the infrastructure level of the cloud as best exemplified by Amazon S3 and EC2, doesn't have this kind of dynamic." Nick argues: "The network effect is indeed an important force shaping business online, and O'Reilly is right to remind us of that fact. But he's wrong to suggest that the network effect is the only or the most powerful means of achieving superior market share or profitability online or that it will be the defining formative factor for cloud computing." Both of them also argue about applying power laws to cloud computing. I am with Nick on the power laws but strongly disagree with
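For readers who want the power-law claim made concrete: under a Zipf-style power law, the k-th ranked provider's market share falls off roughly as 1/k^s. The sketch below uses made-up parameters purely for illustration; it is not data from either post.

```python
def zipf_shares(n_players: int, s: float = 1.0) -> list[float]:
    """Market share of the k-th ranked player under a 1/k^s power law."""
    weights = [1.0 / (k ** s) for k in range(1, n_players + 1)]
    total = sum(weights)
    return [w / total for w in weights]

shares = zipf_shares(10)
# With s=1 and ten players, the leader holds ~34% and the top three ~63% -
# the winner-take-most curve that the power-law argument predicts, with or
# without a classic network effect driving it.
```

The interesting part of the debate is whether cloud infrastructure markets end up on this curve because of network effects, or merely because of scale economies; the curve itself looks the same either way.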