Intel invited a group of journalists to its Hillsboro, Oregon, campuses for a "Day in the Cloud" to talk about cloud computing. The big problem with the "cloud" is the lack of a commonly accepted definition – the 15-second kind you'd use when pitching your start-up idea to a venture capitalist in an elevator.

Most companies want the cloud to make it easier for their employees to access data and applications so they can collaborate more effectively. Email was one of the earliest collaboration tools; along with the ability to download and upload files from "big iron," it became a basic collaboration requirement for any enterprise IT department. Today, we often have live video conferencing with application and data files being accessed simultaneously across incredible distances.

Our "Day in the Cloud" started with Intel’s GM for High Density Computing and Cloud Services, Jason Waxman, dividing his cloud overview into four topics:

Security, Efficiency, Manageability, and Lock-in.

Waxman explained that by 2015, a projected 2.5 billion people with 10 billion devices will be accessing the Internet. To handle the cloud over the next four years, there needs to be an 8X increase in bandwidth, a 16X increase in storage, and a 20X increase in computing power.

In most people's minds, security is the first item in any conversation about the cloud, and the lack of it is what we journalists like to point out. Most organizations still control their own data locations, but that doesn't negate security problems. Here are a couple of examples of security failures at companies that were not using the cloud and were in charge of their own data:

Kaiser Permanente, an HMO (Health Maintenance Organization), had to notify 29,500 of its Northern California employees that a security breach led to the release of their personal information, including Social Security numbers. Kaiser Permanente has a long history of having its patients' confidential data compromised by security breaches. Apparently, it is not very good at protecting its employees' confidential data, either.

Already this year, the New York City Health and Hospitals Corporation's (HHC) North Bronx Healthcare Network reported a security breach affecting 1,700,000 people.

Waxman suggested that when a company's data is kept in a secure off-site storage facility, security breaches are less likely, because the storage provider is governed by standards and wants to keep its customers satisfied. He emphasized that the cloud computing industry, along with governments, must raise everyone's confidence that data stays secure and private as it moves from the desktop to the cloud.

To help convince skeptics, Intel has set up its Cloud Builders program and assisted in starting the Open Data Center Alliance initiative. Waxman explained that a cloud user has to comply with local regulations no matter where the data and application actually reside. For example, Europe has strict laws and regulations about data privacy and where data can reside: personal data for a European citizen cannot be moved outside the European Union.

Waxman said Intel has a lot of customers who want to do something in the cloud, but they want someone to show them how. Intel has partnered with large vendors such as Cisco on enterprise solutions. One of the demos we saw was Citrix moving a workload from Intel's Cloud Builders data center in Folsom, California, which Intel treats as a private cloud, to the Hillsboro, Oregon, facility, which acts as the public cloud. Oxygen Cloud showed a service that lets end-users share and access corporate documents from devices such as smartphones and tablets without giving up security or flexibility.

Intel's Cloud Builders - Partner ecosystem

Billy Cox, Director of Intel's Cloud Builders program, explained that they have developed over 25 reference architectures and expect to have another 25 by the end of the year. Cox explained that in the past, reference designs for data centers were very much a one-off approach. Standards behind the cloud approach allow hardware and software designers to simplify their designs. Thus, a cloud-based data center can be reconfigured without a 'rip it all out and replace it' approach – while maintaining the necessary security, efficiency, and manageability.

Intel has invested heavily in building test labs that allow hardware and software developers to go from proof of concept to a fully developed system. This lets integrators, consultants, developers, service providers, and others evaluate their cloud designs on Intel architecture.

Intel Cloud Builders Reference Architectures

Intel's competitors are showing their own reference designs. The ARM architecture is moving up the server food chain with the Cortex-A15 IP announcement. ARM-based server maker Calxeda and CPU designer Tilera have gotten Intel to think small, too.

Intel MicroServer Reference Design: Xeon 3400 CPU-based system became the idea for HP ProLiant MicroServer

We were shown Intel's original micro-server reference design, powered by a low-power version of its Xeon 3400 CPU. Intel's reference design became the idea for the HP ProLiant MicroServer.

SeaMicro's new server uses low-power Intel Atom netbook chips. The SM10000-64 server packs "850GHz" of processing power and is targeted at data centers that handle a large volume of Internet transactions. Unfortunately, none of these servers were available for our "Day in the Cloud".

The cloud is a work in progress and needs the push toward standards and reference designs that Intel is providing. Intel was a gracious host, providing BSN* with transportation and lodging last week. We are looking forward to following up with the vendors we were introduced to at Hillsboro.

For all the known and unknown hurdles that lie ahead for cloud computing, one thing's for sure: there's plenty of interest in the concept, and no shortage of cheerleaders to move the ideas forward. To close, we leave you with the event's official video: