Day two of the Gartner Data Center Conference in Las Vegas continued reinforcing old topics, appearing at times either to enlist attendees in contributing to Gartner research, or simply to provide conference content directed at promoting conference sponsors.
For example, the sessions “To the Point: When Open Meets Cloud” and “Backup/Recovery: Backing Up the Future” included a series of audience surveys, apparently the same surveys presented in the same sessions for several years. The speakers immediately compared this year’s results against results from the same survey questions from the past two years. This would lead a casual attendee to believe nothing radically new is being presented on these topics, and that attendees are in effect contributing to further trend-analysis research that will eventually show up in a commercial Gartner research note.
Gartner analyst Aneel Lakhani, speaker on the topic of “When Open Meets Cloud,” did make a couple of useful, if not obvious, points in his presentation.
- You cannot secure complete freedom from vendors, regardless of how much open source you adopt
- Open source can actually be more expensive than commercial products
- Interoperability is easy to say, but a heck of a lot more complicated to implement
- Enterprise users have a very low threshold for “test” environments (sorry DevOps guys)
- If your organization has the time and staff, test, test, and test a bit more to ensure your open source product will perform as expected or designed
However, analyst Dave Russell, speaker on the topic of “Backup/Recovery,” was a bit more cut-and-paste in his approach: lots of questions to match against last year’s conference, and a strong emphasis on tape as a continuing, if not growing, medium for disaster recovery.
The problem with this presentation was that the discussion centered on backing up data, with very little on business continuity. In fact, one slide referenced a recovery point objective (RPO) of one day for backups. What organization operating in a global market, in Internet time, can possibly design for a one-day RPO?
In addition, there was no discussion of the need for compatible hardware at a disaster recovery site that would allow immediate or rapid restart of applications. Having data on tape is fine. Having mainframe archival data is fine. But without a business continuity capability, an organization will likely suffer significant damage to its ability to function in its marketplace. Very few organizations today can absorb an extended outage of their global presence or marketplace.
The conference continues until Thursday, and we will look for more positive approaches to data center and cloud computing.
Federal, state, and local government agencies gathered in Washington D.C. on 16 February to participate in Cloud/Gov 2012, held at the Westin Washington D.C. With keynotes by David L. McLure, US General Services Administration, and Dawn Leaf, NIST, vendors and government agencies were brought up to date on federal cloud policies and initiatives.
Of special note were updates on the FedRAMP program (a government-wide program that provides a standardized approach to security assessment, authorization, and continuous monitoring for cloud products and services) and NIST’s progress on standards. “The FedRAMP process chart looks complicated,” noted McLure, “however we are trying to provide the support needed to accelerate the (FedRAMP vendor) approval process.”
McLure also provided a roadmap for FedRAMP implementation, with FY13/Q2 targeted for full operation and FY14 planned for sustaining operations.
In a panel focusing on government case studies, David Terry from the Department of Education commented that “mobile phones are rapidly becoming the access point (to applications and data) for young people.” Applications (SaaS) should be written to accommodate mobile devices, and “auto-adjust to user access devices.”
Tim Matson from DISA highlighted the US Department of Defense’s Forge.Mil initiative, which provides an open collaboration community for the military and the development community to work together in rapidly developing new applications to better support DoD activities. While Forge.Mil has tighter controls than standard GSA (US General Services Administration) standards, Matson emphasized “DISA wants to force the concept of change into the behavior of vendors.” Matson continued, explaining that Forge.Mil will reinforce “a pipeline to support continuous delivery” of new applications.
While technology and process change topics provided the majority of discussion points, most of them enthusiastic, David Mihalchik from Google advised “we still do not know the long term impact of global collaboration. The culture is changing, forced on by the idea of global collaboration.”
Other areas of discussion among panel members throughout the day included the need for establishing and defining service level agreements (SLAs) for cloud services. Daniel Burton from SalesForce.Com explained that their SLAs are broken into two categories: those based on subscription services, and those based on specific negotiations with government customers. Other vendors took a stab at explaining their SLAs without giving specific examples, leaving the audience without a solid answer.
NIST Takes the Leadership Role
The highlight of the day was provided by Dawn Leaf, Senior Executive for Cloud Computing with NIST. Leaf provided very logical guidance for all cloud computing stakeholders, including vendors and users.
“US industry requires an international standard to ensure (global) competitiveness,” explained Leaf. In the past, US vendors and service providers have developed standards that were not compatible with European and other standards, notably in wireless telephony, and one of NIST’s objectives is to participate in developing a global standard for cloud computing to prevent a repeat of that fragmentation.
Cloud infrastructure and SaaS portability is also a high interest item for NIST. Leaf advises that “we can force vendors into demonstrating their portability. There are a lot of new entries in the business, and we need to force the vendors into proving their portability and interoperability.”
Leaf also reinforced the idea that standards are developed in the private sector. NIST provides guidance and an architectural framework for vendors and the private sector to use as reference when developing those specific technical standards. However, Leaf also had one caution for private industry: “industry should try to map their products to NIST references, as the government is not in a position to wait” for extended debates on the development of specific items when the need for cloud computing development and implementation is immediate.
Further information on the conference, with agendas and participants, is available at www.sia.net.
On October 20th, Bill Reidway, Vice President of Numbering Services Product Management at Neustar, blogged on the topic of number portability, and why it is important to both the telecom industry and end users. As manager of the National Portability Administration Center (NPAC), Neustar connects more than 2,000 carriers in North America, supporting users’ ability to change carriers without changing their phone numbers, and seamlessly routing calls among all carriers regardless of the original source of individual phone numbers or number blocks.
Pacific-Tier Communications interviewed Reidway with the intent to learn more about Neustar’s activities with the NPAC, as well as dig a bit deeper into the company’s vision on the future of telephony, telephone numbers, and communications.
Origins of the NPAC
According to Reidway, administration of the NPAC has continued to change since local number portability was mandated as part of the Telecommunications Act of 1996. Neustar has managed the NPAC program since 1997, with changes along the way including addition of wireless network portability, internodal portability, and most recently in 2007, VoIP carrier portability.
Reidway is convinced telephone numbers and telephone carriers have a good future. While many talk about the potential of peer-to-peer technologies, such as Skype, as the future of communications, Reidway strongly believes the need for telephone numbers remains unabated. “Even Skype needs to connect to the PSTN (public switched telephone network) to provide a meaningful user experience” noted Reidway. “Bypassing the telephone number is still an exception to the rule.”
While emphasizing that existing TDM networks offer a great deal of control, particularly in cutting down unwanted telephony traffic, Reidway cautions that the IP telephony world is still a bit like the wild, wild west, raising challenges in security, load balancing, and network authorization. “Neustar has to keep up with technology,” continued Reidway, explaining that the telecom industry has made the decision to support Internet protocols (IP). He uses the cable industry as an example of carriers running “all” IP telephony networks.
Decline of the Fixed Line Network
It is clear fixed line telephone services in the United States are beginning a rapid decline, with users favoring mobile phones and computer-enabled telephony. Reidway fully appreciates the dynamics of user migrations and mobility, assuring the NPAC is not constrained by the “vagaries” associated with fixed-line networks and location. “As the fixed line network begins to fall by the wayside” explained Reidway, “the notion of telephone numbers associated with a specific geography falls with it.”
Reidway also explained that although telephone numbers no longer have rigid location sensitive significance, users still generally prefer to associate their phone numbers with a location, and that is particularly important for business users. While it is certainly possible for a business or individual to use an area code, or even country code from any point in the world, he believes an area code “still says something about the identity behind the number.”
A Peek into the Future
Neustar currently has no specific plan to change NPAC’s operations, as carriers understand there is an ample supply of telephone numbers available, likely sufficient for at least several decades. Additional headroom comes through number pooling: in 2000 the FCC allowed smaller carriers holding large quantities of unused telephone numbers to contribute those excess resources to a common pool for distribution to other carriers in need of additional numbers.
When asked about the potential for individuals, businesses, and even objects such as refrigerators to tag an identity to an IPv6 address, with all modes of communication ultimately finding a way to that identity, Reidway acknowledged the question. He considers the issue, and its very long term significance, an important discussion, and one he is prepared to engage.
The communications and network-enabled global community are changing quickly to meet the needs of existing and new users. Infrastructure shortfalls in many locations around the world, which have historically prevented citizens from joining the global community, are now being addressed, giving nearly every point of the world some level of access to the Internet, long before most are able to secure a fixed line telephone.
Impact of Peer-to-Peer
As of September 2011 Skype claims more than 660 million registered users, roughly one tenth of the world’s population, representing more than 190 billion minutes of non-telephony, unpaid communications, with 13% of those minutes bypassing international carriers.
As the concept of interpersonal communications continues to morph into forms that may not be easily envisioned today, Reidway maintains confidence that Neustar, with additional services such as domain name and registry services, IP geolocation, and IP translation/mapping services such as ENUM, and the NPAC have both the flexibility and the resources to ensure North American carriers, users, and networks are not caught short in the global move to Internet-enabled multi-media and communication services.
Reidway concluded “we have the experience and capability to help any transition to new technologies and emerging forms of communication.”
You can read all of Reidway’s blogs at Neustar Insights, and comment on his ideas, visions, and support of the North American communications community.
NOTE: Pacific-Tier Communications LLC is not affiliated with Neustar or the NPAC. This interview and article are intended to inform readers of the NPAC, and some of the thought leaders responsible for managing and developing the infrastructure needed to keep the US and North America competitive in the global market and community.
With dozens of public cloud service providers on the market, offering a wide variety of services, standards, SLAs, and options, how does an IT manager make an informed decision on which provider to use? Is it time in business? Location? Cost? Performance?
Pacific-Tier Communications met up with Jason Read, owner of CloudHarmony, a company specializing in benchmarking the cloud, at Studio City, California, on 25 October. Read understands how confusing and difficult it is to evaluate different service providers without an industry-standard benchmark. In fact, Read started CloudHarmony based on his own frustrations as a consultant helping a client choose a public cloud service provider, while attempting to sort through vague cloud resource and service terms used by industry vendors.
“Cloud is so different. Vendors describe resources using vague terminology like 1 virtual CPU, 50 GB storage. I think cloud makes it much easier for providers to mislead. Not all virtual CPUs and 50 GB storage volumes are equal, not by a long shot, but providers often talk and compare as if they are. It was this frustration that led me to create CloudHarmony” explained Read.
So, Read went to work creating a platform, for his client as well as other consultants and IT managers, that provides a single point for testing public cloud services not only within the US but around the world. Input to the testing platform came from aggregating more than 100 testing benchmarks and methodologies available to the public. However, CloudHarmony standardized on CentOS/RHEL Linux, an operating system all cloud vendors support, “to provide as close to an apples to apples comparison as possible,” said Read.
Customizing a CloudHarmony Benchmark Test
Setting up a test is simple. You go to the CloudHarmony Benchmarks page, select the benchmarks you would like to run, the service providers you would like to test, configurations of virtual options within those service providers, geographic location, and the format of your report.
Figure 1 (Benchmark Configuration) shows a sample report setup.
“CloudHarmony is a starting point for narrowing the search for a public cloud provider,” advised Read. “We provide data that can facilitate and narrow the selection process. We don’t have all of the data necessary to make a decision related to vendor selection, but I think it is a really good starting point.”
Read continued, “for example, if a company is considering cloud for a very CPU intensive application, using the CPU performance metrics we provide, they’d quickly be able to eliminate vendors that utilize homogenous infrastructure with very little CPU scaling capability from small to larger sized instances.”
Cloud vendors listed in the benchmark directory are surprisingly open to CloudHarmony testing. “We don’t require or accept payment from vendors to be listed on the site and included in the performance analysis,” mentioned Read. “We do, however, ask that vendors provide resources to allow us to conduct periodic compute benchmarking, continual uptime monitoring, and network testing.”
When asked if cloud service providers contest or object to CloudHarmony’s methodology or reports, Read replied “not frequently. We try to be open and fair about the performance analysis. We don’t recommend one vendor over another. I’d like CloudHarmony to simply be a source of reliable, objective data. The CloudHarmony performance analysis is just a piece of the puzzle, users should also consider other factors such as pricing, support, scalability, etc.”
During an independent trial of CloudHarmony’s testing tool, Pacific-Tier Communications selected the following parameters to complete a sample CPU benchmark:
- CPU Benchmark (Single Threaded CPU)
- GMPbench math library
- Cloud Vendor – AirVM (MO/USA)
- Cloud Vendor – Amazon EC2 (CA/USA)
- Cloud Vendor – Bit Refinery Cloud Hosting (CO/USA)
- 1/2/4 CPUs
- Small/Medium/Large configs
- Bar Chart and Sortable Table report
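The selections above amount to a small configuration object. As a rough illustration of the scope of such a run, the parameters could be represented like this; the field names are hypothetical and illustrative only, not CloudHarmony's actual schema or API:

```python
# Hypothetical representation of the benchmark run configured above.
# Field names are illustrative; CloudHarmony's real schema may differ.
benchmark_config = {
    "benchmark": "cpu-single-threaded",
    "suite": "gmpbench",
    "providers": [
        {"name": "AirVM", "region": "MO/USA"},
        {"name": "Amazon EC2", "region": "CA/USA"},
        {"name": "Bit Refinery Cloud Hosting", "region": "CO/USA"},
    ],
    "cpus": [1, 2, 4],
    "instance_sizes": ["small", "medium", "large"],
    "report_formats": ["bar-chart", "sortable-table"],
}

# If every provider were tested at every CPU/size combination, this
# single run would cover 3 providers x 3 CPU counts x 3 sizes = 27 cells.
cells = (len(benchmark_config["providers"])
         * len(benchmark_config["cpus"])
         * len(benchmark_config["instance_sizes"]))
print(cells)  # 27
```

Even a modest selection like this fans out quickly, which is part of why an aggregated testing platform is more practical than testing each provider by hand.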
The result, shown in Figure 2, includes performance measured against each of the above parameters. Individual tests for each parameter are available, allowing a deeper look into the resources used and the test results based on those resources.
In addition, as shown in Figure 3, CloudHarmony provides a view of uptime statistics for dozens of cloud service providers over a period of one year. At the time of this article, uptime statistics ranged between 98.678% and 100% availability, with 100% current uptime (27 October).
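To put those availability figures in perspective, a quick sketch (ours, not CloudHarmony's) converts an annual availability percentage into hours of downtime:

```python
# Convert an annual availability percentage into implied downtime.
# Illustrative arithmetic only; not part of CloudHarmony's tooling.

HOURS_PER_YEAR = 365 * 24  # 8760, ignoring leap years

def annual_downtime_hours(availability_pct: float) -> float:
    """Hours of downtime per year implied by an availability percentage."""
    return (1 - availability_pct / 100) * HOURS_PER_YEAR

# The low end of the range reported above, 98.678%, allows roughly
# 116 hours (nearly five days) of downtime per year.
print(round(annual_downtime_hours(98.678), 1))  # 115.8
print(annual_downtime_hours(100.0))             # 0.0
```

The gap between 98.678% and 100% looks tiny as a percentage, but it is the difference between zero downtime and almost five days of it, which is why a one-year uptime view is so useful when comparing providers.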
Who Uses CloudHarmony Benchmark Testing?
While the average user today may be in the cloud computing industry, likely vendors eager to see how their products compare against competitors, Read targets CloudHarmony’s product at “persons responsible for making decisions related to cloud adoption,” although he admits that today most users of the site lean towards the technical side of the cloud service provider industry.
Running test reports on CloudHarmony is based on a system of purchasing credits. Read explained, “we have a system in place now where the data we provide is accessible via the website or web services – both of which rely on web service credits to provide the data. Currently, the system is set up to allow 5 free requests daily. For additional requests, we sell web service credits where we provide a token that authorizes you to access the data in addition to the 5 free daily requests.”
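The accounting scheme Read describes (five free requests per day, then paid web-service credits unlocked by a token) could be modeled roughly as below. This is our own sketch of the described behavior; the class and method names are hypothetical, not CloudHarmony's actual API:

```python
# Hypothetical model of CloudHarmony-style request accounting:
# 5 free requests per day, then purchased web-service credits.
class CreditLedger:
    FREE_DAILY = 5

    def __init__(self, credits: int = 0):
        self.credits = credits        # purchased web-service credits
        self.free_used_today = 0      # a real system would reset this daily

    def request(self) -> str:
        """Account for one data request; return how it was paid for."""
        if self.free_used_today < self.FREE_DAILY:
            self.free_used_today += 1
            return "free"
        if self.credits > 0:
            self.credits -= 1
            return "credit"
        raise RuntimeError("daily free quota exhausted and no credits left")

ledger = CreditLedger(credits=2)
results = [ledger.request() for _ in range(7)]
print(results)  # five "free" entries followed by two "credit" entries
```

In other words, light exploratory use stays free, and sustained or automated querying draws down purchased credits.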
The Bottom Line
“Cloud is in many ways a black box” noted Read. “Vendors describe the resources they sell using sometimes similar and sometimes very different terminology. It is very difficult to compare providers and to determine performance expectations. Virtualization and multi-tenancy further complicates this issue by introducing performance variability. I decided to build CloudHarmony to provide greater transparency to the cloud.”
And for both vendors and potential cloud service customers, it provides an objective, honest, transparent analysis of commercially available public cloud services.
Check out CloudHarmony and their directory of services at cloudharmony.com.
During his opening keynote speech at ICEGOV 2011 in Tallinn, Estonia, President Toomas Hendrik Ilves highlighted efforts of Estonia’s Cyber Defense League, an operational arm of the country’s National Defense League.
An all-volunteer force, the Cyber Defense League acts as a national guard protecting Estonia from cyber attack, following the major assault on the country in 2007 by Russian hackers.
“Our country encourages IT professionals to contribute to national defense as part-time members of our cyber national guard,” said Ilves. These are young people “who are motivated, patriotic, and think it (contributing to national defense) is pretty cool.”
Traditional Barriers to National Service Removed
Recruits entering their country’s national service, such as the army, normally follow a similar track. The first year of service provides an exercise in mental torture, mental strengthening, physical training, gathering skills to function in the infantry, and all the other training needed to bring a civilian into a basic level of competence for military service.
This standard routine serves to exclude individuals who may be far more interested in technology, academic pursuits, sciences, and to be honest, becoming serious network or software geeks. While there may be an argument that military organizations have become much better in their cyber-warfare capabilities, it can also be argued many of the best minds in a country are those developing technology systems, rather than super users.
Estonia, home of Skype and other global software initiatives, is harnessing the power of their intellectual resources in a positive way, which also promotes national security, pride, and patriotism.
Cyber Weekend Warriors
The Cyber Defense League (CDL) is a uniformed service, equal in stature and responsibility to other arms of the National Defense League. Recruits require security clearances, and are available for mobilization in the event of a national emergency – regardless of the nature of that emergency.
CDL members muster for weekend duty, exercises, and additional cyber security and warfare training.
Cooperation between private industry and national defense is much closer than in countries such as the US, where even during national emergencies commercial companies are rarely engaged in immediate cyber attack and response – at least not in full cooperation with the government or military. There may be representation in groups such as the CERT, however even those organizations generally act outside the scope of national defense.
In Estonia, commercial companies and many of their employees are now an inherent component of the national cyber defense.
… Be Cyber Strong
So, consider a model of supplementing national security by recruiting engineers, developers, and technicians in a single model location such as Silicon Valley, training them to extend their skills to support national defense, then completing a background check and offering a security clearance. What would the potential impact be on reinforcing our California or national cyber protection capacity?
Add more highly skilled engineers from other technology “industry cluster” states to the defense system, and it is highly probable that we will make great strides in further strengthening our local and national cyber defense.
Of course in the United States we do have to get over some additional concerns, such as suspicion among many in Internet and technology communities who may not fully trust the intentions of the government.
The burden is on the government to establish programs, develop a thought leadership campaign to build a sense of service and pride, and then fully embrace extremely motivated and intelligent IT professionals into the military community.
One of those programs is the DOD’s Defense Industrial Base Cyber Pilot, which allows the DoD to share some information with private enterprise regarding threats to security. However it is clearly a superficial attempt, and does not seek to actively engage those who potentially have the best skills to offer.
Estonia is a small country, struggling to break free of the social and institutional constraints imposed by nearly 70 years of Soviet and Nazi occupation, and the economic restraints of a global recession. It is a country with a motivated workforce, and a need to protect all of its national wired resources from theft, exploitation, and attack.
The Cyber Defense League is a unique and creative step toward providing that security and protection.
We have all seen the videos of Rodney King’s beating in Los Angeles, Oscar Grant’s death at the BART station in Oakland, Anthony Graber’s arrest for videotaping his own arrest in Maryland, and other “caught on video” scenes with public officials behaving outside the law or violating the rights of citizens. The question brought before the US 1st Circuit Court of Appeals, is simple – is filming a public official in the performance of their duties a right guaranteed us under the 1st amendment of the US Constitution?
Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances. (1st Amendment, US Constitution)
On 26 August the US Court of Appeals issued its opinion, following a suit filed by the ACLU on behalf of Simon Glik, a citizen arrested on 1 October 2007 for filming a Boston Police takedown of a man in Boston Common. Glik was standing in a public location, at least 10 feet away from the officers, when suspicion of excessive use of force prompted him to film the incident with his cell phone. A Boston police officer challenged Glik on whether he was capturing audio with his film, and upon his admitting he was, the officers arrested Glik under the Massachusetts wiretap statute.
It is firmly established that the First Amendment’s aegis extends further than the text’s proscription on laws “abridging the freedom of speech, or of the press,” and encompasses a range of conduct related to the gathering and dissemination of information. As the Supreme Court has observed, “the First Amendment goes beyond protection of the press and the self-expression of individuals to prohibit government from limiting the stock of information from which members of the public may draw.”
…An important corollary to this interest in protecting the stock of public information is that “[t]here is an undoubted right to gather news ‘from any source by means within the law.’”
The filming of government officials engaged in their duties in a public place, including police officers performing their responsibilities, fits comfortably within these principles.
Gathering information about government officials in a form that can readily be disseminated to others serves a cardinal First Amendment interest in protecting and promoting “the free discussion of governmental affairs.” (Case 10-1764)
This is a major win for citizens, and for citizen journalists. With “mainstream” media such as CNN recruiting “iReporters” for their broadcasts, it is a clear message to the world that traditional journalists cannot adequately cover world events, and that citizen journalists have a role in filling coverage gaps during rapidly evolving events.
While we can acknowledge police officers and public officials will not always warmly embrace this decision, it is the law.
Why it is Important to Support Citizen Journalism and the Right to Record Events
National and local newspapers are rapidly closing due to either mismanagement (the owners did not see the radical changes prompted by the digital age), bankruptcy, or combinations of both dynamics. The “Newspaper Deathwatch” website highlights the major newspapers that have closed in the past five years, and those which have either announced their demise or change to an all digital format.
With the loss of newspapers, the media industry is also losing experienced reporters, creating major shortfalls in coverage of public events (such as city hall meetings, school board meetings, etc), as well as incidents and events occurring within the community (such as accidents, fires, weather-related news, etc).
Among the more well-known sites which have closed or changed formats are:
- Honolulu Advertiser (closed and merged with the Star-Bulletin)
- Rocky Mountain News
- Cincinnati Post
- Baltimore Examiner
- Seattle Post-Intelligencer
- Detroit News/Free Press
- Christian Science Monitor
This means local news sources are becoming more scarce, depriving citizens of knowledge related to their local community. The digital age supports gathering and recovering that knowledge, primarily through user generated content. An emerging trend in news gathering and presentation is through hyper-local web sites focusing on individual communities or geographies. As many startup hyper-local media sources are self-funded or lacking ample startup funding, the editors and owners do rely on citizen generated content to provide news to readers.
The US Court of Appeals in their decision on recording public officials and police have fortunately accepted and understood the changing technologies and media environment, acknowledging citizens recording events are protected under the Bill of Rights, and those citizens are also protected from illegal arrest, search, or seizure of their media.
NOTE: Attempts to contact the public affairs/information officer (PIO) at several Los Angeles area police departments were unsuccessful. If the PIOs do eventually respond, we will update the blog with that response. This is not meant to degrade the professionalism or courage of police officers, rather it is meant to highlight citizen rights under the 1st and 4th amendments under the US Constitution’s Bill of Rights.