The 2013 ACC kicked off on Tuesday morning with an acknowledgement by Philippine Long Distance Telecommunications (PLDT) CEO Napoleon L. Nazareno that “we’re going through a profound and painful transformation to digital technologies.”  He went on to explain that, in addition to making the move to a digital corporate culture and architecture, traditional telcos will need to “master new skills, including new partnership skills” if they are to succeed.

That direction draws a line straight down the middle of the conference’s attendees.  Surprisingly, many of the companies attending and advertising their products still focus on “minutes termination” and traditional voice-centric relationships with other carriers and voice wholesalers.

Matthew Howett, Regulation and Policy Practice Leader at Ovum Research, noted that “while fixed and mobile minutes are continuing to grow, traditional voice revenue is on the decline.”  He backed the statement up with figures on “Over the Top” (OTT) services – services in which a provider sends communications of all types, including video and voice, over an Internet protocol network, most commonly the public Internet.

Howett informed the ACC’s plenary session attendees that Ovum Research believes up to US$52 billion will be lost in traditional voice revenues to OTT providers by 2016, and an additional US$32.6 billion to instant messaging providers in the same period.

The message to traditional communications carriers was simple – adapt or become irrelevant.  National carriers may work with government regulators to erect legal barriers against OTT providers operating in their countries; however, that is only a temporary step to stem the flow of “technology-enabled” competition and retain revenues.

As Nazareno noted, carriers must wake up to the reality that we are in a global technology refresh cycle, and must reset their business visions and expectations, constructing business plans that will not only allow the company to survive, but also meet the needs of their users and national objectives.

Martin Geddes, owner of Martin Geddes Consulting, introduced the idea of “task substitution,” which occurs when an individual or organization is able to use a substitute technology or process to accomplish tasks that were previously available only from a single source.  One example is the traditional telephone call.  In the past you would dial a number, and the telephone company would go through a series of connections, switches, and processes that would both connect the two end devices and provide accounting for the call.

The telephone user now has many alternatives to the traditional phone call – all task substitutions.  You can use Skype, WebEx, GoToMeeting, instant messaging – any one of a multitude of utilities allowing an individual or group to participate in one-to-one or many-to-many communications.  When a strong list of alternative methods to complete a task exists, the original method may become obsolete, or have to rapidly adapt to avoid being discarded by users.

A strong message, which made many attendees visibly uncomfortable.

Ivan Landen, Managing Director for Asia-Pacific at Expereo, described the telecom revolution in terms all attendees could easily visualize:  “Today around 80% of the world’s population have access to the electrical grid, while more than 85% of the population has access to WiFi.”

He also provided an additional bit of information which did not surprise attendees, but made some of the telecom representatives a bit uneasy: in a survey Geddes conducted, more than half of the business executives polled admitted their Internet access was better at home than in their offices.  This information can be read several different ways, from poor IT planning within the company, to poor capacity management within the communication provider, to the reality that traffic on consumer networks is simply lower during the business day than during other time periods.

However, the main message was that “there is a huge opportunity for communication companies to fix business communications.”

The conference continues until Friday.  Many more sessions, many more perimeter discussions, and a lot of space for the telecom community to absorb the reality that “we need to come to grips with the digital world.”


Normally, when we think of technical training, images of rooms loaded with switches, routers, and servers might come to mind.  Cloud computing is different.  In reality, cloud computing is not a technology, but rather a framework employing a variety of technologies – most notably virtualization – to solve business problems or enable opportunities.

From our own practice, the majority of cloud training students represent non-technical careers and positions.  Our training follows the CompTIA Cloud Essentials course criteria and is not a technical course, so the non-technical student trend should not come as any big surprise.

What does come as a surprise is how enthusiastically our students dig into the topic.  Whether business unit managers, accounting and finance, sales staff, or executives, all students come into class convinced they need to know about cloud computing as an essential part of their future career progression, or even at times to ensure their career survival.

Our local training methodology is based on establishing an in-depth knowledge of the NIST Cloud Definitions and Cloud Reference Architecture.  Once students get beyond the perception that such documents are too complex, and accept that we will refer nearly all aspects of training back to both documents, we easily establish the core cloud computing knowledge base needed to explore both the technical and, more importantly, the practical aspects of how cloud computing is used in our daily lives, and likely future lives.

This is not significantly different than when we trained business users on how to use, employ, and exploit  the Internet in the 90s.  Those of us in engineering or technical operations roles viewed this type of training with either amusement or contempt, at times mocking those who did not share our knowledge and experience of internetworking, and ability to navigate the Internet universe.

We are in the same phase of absorbing and developing tacit knowledge of compute and storage access on demand, service-oriented architectures, Software as a Service, and the move to a subscription-based application world.

Students who attend cloud computing training leave the class better able to engage in decision-making related to both personal and organizational information and communication technology, and less exposed to the spectrum of cloud washing – the marketing use of “cloud” and “XXX as a Service” language now overwhelming nearly all media, on subjects ranging from hamster food to SpaceX and hyperloops.

Even the hardest-core engineers, who may feel they have degraded themselves by joining a non-technical, business-oriented cloud course, walk away with a better view of how their tools support organizational agility (good jargon, no?), in addition to the potential financial impacts, reduced application development cycles, disaster recovery, business continuity, and all the other potential benefits to an organization adopting cloud computing.

Some even walk away from the course planning a breakup with some of their favorite physical servers.

The Bottom Line

No student has walked away from a cloud computing course knowing less about the role, impact, and potential of implementing cloud in nearly any organization.  While the first few hours of class embrace a lot of great debates on the value of cloud computing, by the end of the course most students agree they are better prepared to consider, envision, evaluate, and address the potential or shortfalls of cloud computing.

Cloud computing is influencing, and will continue to influence, many aspects of our lives.  It is not going away anytime soon.  The more we can learn, whether through self-study or resident training, the better positioned we’ll be to make intelligent decisions regarding the use and value of cloud in our lives and organizations.

International telecommunication carriers all share one thing in common – the need to connect with other carriers and networks.  We want to make a call to China, hold a video conference with Moldova, or send an email for delivery within 5 seconds to Australia – all possible with our current state of global communications.  Magic?  Of course not.  While an abstraction to most, the reality is that telecommunications physical infrastructure extends to nearly every corner of the world, and communications carriers bring this global infrastructure together at a small number of strategically placed facilities informally called “carrier hotels.”

Pacific-Tier had the opportunity to visit the Westin Building Exchange (commonly known as the WBX), one of the world’s busiest carrier hotels, in early August.   Located in the heart of Seattle’s bustling business district, the WBX stands tall at 34 stories.  The building also acts as a crossroads of the Northwest US long distance terrestrial cable infrastructure, and is adjacent to trans-Pacific submarine cable landing points.

The world’s telecommunications community needs carrier hotels to interconnect their physical and value-added networks, and the WBX is doing a great job of facilitating physical interconnections among its more than 150 carrier tenants.

“We understand the needs of our carrier and network tenants” explained Mike Rushing,   Business Development Manager at the Westin Building.  “In the Internet economy things happen at the speed of light.  Carriers at the WBX are under constant pressure to deliver services to their customers, and we simply want to make this part of the process (facilitating interconnections) as easy as possible for them.”

The WBX community is not limited to carriers.  The community has evolved to support Internet service providers, content delivery networks (CDNs), cloud computing companies, academic and research networks, enterprise customers, public colocation and data center operators, the NorthWest GigaPOP, and even the Seattle Internet Exchange (SIX), one of the largest Internet exchange points in the world.

“Westin is a large community system,” continued Rushing.  “As new carriers establish a point of presence within the building, and begin connecting to others within the tenant and accessible community, then the value of the WBX community just continues to grow.”

The core of the WBX is the 19th floor meet-me-room (MMR).  The MMR is a large, neutral, interconnection point for networks and carriers representing both US and international companies.  For example, if China Telecom needs to connect a customer’s headquarters in Beijing to an office in Boise served by AT&T, the actual circuit must transfer at a physical demarcation point from China Telecom  to AT&T.  There is a good chance that physical connection will occur at the WBX.

According to Kyle Peters, General Manager of the Westin Building, “we are supporting a wide range of international and US communications providers and carriers.  We fully understand the role our facility plays in supporting not only our customer’s business requirements, but also the role we play in supporting global communications infrastructure.”

You would be correct in assuming the WBX plays an important role in that critical US and global communications infrastructure.  Thus you would further expect the WBX to be constructed and operated in a manner that gives the community a high level of confidence its installed systems will not fail.

Lance Forgey, Director of Operations at the WBX, manages not only the MMR, but also the massive mechanical (air conditioning) and electrical distribution systems within the building.  A former submarine engineer, Forgey runs the Westin Building much like he operated critical systems within Navy ships.  Assisted by an experienced team of former US Navy engineers and US Marines, the facility presents an image of security, order, cleanliness, and operational attention to detail.

“Our operations and facility staff bring the discipline of many years in the military, adding the innovation needed to keep up with our customers’ industries,” said Forgey.  “Once you have developed a culture of no compromise on quality, it is easy to keep things running.”

That is very apparent when you walk through the site – everything is in its place, it is remarkably clean, and it is very obvious the entire site is the product of a well-prepared plan.

One area which stands out at the WBX is the cooling and electrical distribution infrastructure.  Using space in adjacent parking structures and other areas outside the building, most heavy equipment is located outside, providing an additional layer of physical security and allowing the WBX to recover as much space within the building as possible for customer use.

“Power is not an issue for us”  noted Forgey.  “It is a limiting factor for much of our industry, however at the Westin Building we have plenty, and can add additional power anytime the need arises.”

That is another attraction of the WBX versus some of the other carrier hotels on the US West Coast.  Power in Washington State averages around $0.04/kWh, while power in California may be nearly three times as expensive.
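As a back-of-the-envelope illustration of what that rate gap means (the 1 MW load size and the $0.12/kWh California rate are assumptions, extrapolated from the “nearly three times” figure above, not figures from the WBX):

```python
# Back-of-the-envelope annual power cost for a constant 1 MW load,
# using the $0.04/kWh Washington rate cited above and an assumed
# $0.12/kWh (roughly 3x) California rate.
HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_power_cost(load_kw: float, rate_per_kwh: float) -> float:
    """Dollars per year for a constant load at a flat energy rate."""
    return load_kw * HOURS_PER_YEAR * rate_per_kwh

washington = annual_power_cost(1_000, 0.04)
california = annual_power_cost(1_000, 0.12)

print(f"Washington: ${washington:,.0f}/year")   # about $350,400/year
print(f"California: ${california:,.0f}/year")   # about $1,051,200/year
print(f"Difference: ${california - washington:,.0f}/year")
```

The flat-rate, constant-load model ignores demand charges and PUE overhead, but even this crude sketch shows why the power bill alone can dominate a colocation siting decision.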

“In addition to having all the interconnection benefits similar operations have on the West Coast, the WBX can also significantly lower operating costs for tenants,” added Rushing.  As power is a major factor in data center operating costs, a significant reduction in its price is a big deal for tenants.

The final area carrier hotels need to address is the ever changing nature of communications, including interconnections between members of the WBX community.  Nothing is static, and the WBX team is constantly communicating with tenants, evaluating changes in supporting technologies, and looking for ways to ensure they have the tools available to meet their rapidly changing environments.

Cloud computing, software-defined networking, carrier Ethernet – all  topics which require frequent communication with tenants to gain insight into their visions, concerns, and plans.  The WBX staff showed great interest in cooperating with their tenants to ensure the WBX will not impede development or implementation of new  technologies, as well as attempt to stay ahead of their customer deployments.

“If a customer comes to us and tells us they need a new support infrastructure or framework with very little lead time, then we may not be able to respond quickly enough to meet their requirements” concluded Rushing.  “Much better to keep an open dialog with customers and become part of their team.”

Pacific-Tier has visited and evaluated dozens of data centers during the past four years.  Some have been very good, some have been very bad.  Some have gone over the edge in data center deployments, chasing the “grail” of a Tier IV data center certification, while some have been little more than a server closet.

The Westin Building / WBX is unique in the industry.  Owned jointly by Clise Properties of Seattle and Digital Realty Trust, the Westin Building brings the best of both the real estate world and the data center world into a single operation.  The quality of the mechanical and electrical infrastructure, the people maintaining it, and the vision of the company give a visitor the impression that not only is the WBX a world-class facility, but that all staff and management know their business, enjoy the business, and put their customers first.

As Clise Properties owns much of the surrounding land, the WBX has plenty of opportunity to grow as the business expands and changes.  “We know cloud computing companies will need to locate close to the interconnection points, so we better be prepared to deliver additional high-density infrastructure as their needs arise” said Peters.  And in fact Clise has already started planning for their second colocation building.  This building, like its predecessor, will be fully interconnected with the Westin Building, including virtualizing the MMR distribution frames in each building into a single cross interconnection environment.

The WBX offers the global telecom industry an alternative to the carrier hotels in Los Angeles and San Francisco.  One shortfall in the global telecom industry is the “single-threaded” links many carriers have with others in the global community.  California hosts the majority of North America–Asia carrier interconnections today, but it is also one of the world’s higher-risk locations for critical infrastructure – it is more a matter of “when” than “if” a catastrophic event such as an earthquake will seriously disrupt the international communications passing through one of the region’s MMRs.

The telecom industry needs to have the option of alternate paths of communications and interconnection points.  While the WBX stands tall on its own as a carrier hotel and interconnection site, it is also the best alternative and diverse landing point for trans-Pacific submarine cable capacity – and subsequent interconnections.

The WBX offers a wide range of customer services, including:

  • Engineering support
  • 24×7 remote hands
  • Fast turnaround for interconnections
  • Colocation
  • Power circuit monitoring and management
  • Private suites and lease space for larger companies
  • 24×7 security monitoring and access control

Check out the Westin Building and WBX the next time you are in Seattle, or if you want to learn more about the telecom community revolving and evolving in the Seattle area.  Contact Mike Rushing at mrushing@westinbldg.com for more information.

 


Just finished another frustrating day of consulting with an organization that is convinced technology is going to solve their problems.  Have an opportunity?  Throw money and computers at the opportunity.  Have a technology answer to your process problems?  Really?

The business world is changing.  With cloud computing potentially eliminating the need for some current IT roles (physical server huggers, for example), information technology professionals – or more appropriately, information and communications technology (ICT) professionals – need to rethink their roles within organizations.

Is it acceptable to simply be a technology specialist, or do ICT professionals also need to be an inherent part of the business process?  Yes, a rhetorical question, and any negative answer is wrong.  ICT professionals are rapidly being relieved of the burden of data centers, servers (physical servers), and a need to focus on ensuring local copies of MS Office are correctly installed, configured, and have the latest service packs or security patches installed.

You can fight the idea, argue the concept, but in reality cloud computing is here to stay, and will only become more important in both the business and financial planning of future organizations.

Now those copies of MS Office are hosted on MS 365 or Google Docs, and your business users are telling you either quickly meet their needs or they will simply bypass the IT organization and use an external or hosted Software as a Service (SaaS) application – in spite of your existing mature organization and policies.

So what is this TOGAF stuff?  Why do we care?

Well…

As it should be, ICT is firmly being established in the organization as a tool to meet business objectives.  We no longer have to consider the limitations or “needs” of IT when developing business strategies and opportunities.  SaaS and Platform as a Service (PaaS) tools are becoming mature, plentiful, and powerful.

Argue the point, fight the concept, but if an organization isn’t at least considering a requirement for data and systems interoperability, the use of large data sets, and implementation of a service-oriented architecture (SOA) they will not be competitive or effective in the next generation of business.

TOGAF, which is “The Open Group Architecture Framework,” brings structure to the development of ICT as a tool for meeting business requirements.  TOGAF forces each stakeholder, including senior management and business unit management, to work with ICT professionals to apply technology in a structured framework that follows these basic steps:

  • Develop a business vision
  • Determine your “AS-IS” environment
  • Determine your target environment
  • Perform a gap analysis
  • Develop solutions to meet the business requirements and vision, and fill the “gaps” between “AS-IS” and “Target”
  • Implement
  • Measure
  • Improve
  • Re-iterate
Of course TOGAF is a complex architecture framework, with a lot more involved than the above bullets.  However, the point is that ICT must now participate in the business planning process – and really become part of the business, rather than a vendor to the business.

As a life-long ICT professional, it is easy for me to fall into indulging in tech things.  I enjoy networking, enjoy new gadgets, and enjoy anything related to new technology.  But it was not until about 10 years ago, when I started taking a formal, structured approach to understanding enterprise architecture and fully appreciating the value of service-oriented architectures, that I felt as if my efforts were really contributing to the success of an organization.

TOGAF was one course of study that really benefited my understanding of the value and role IT plays in companies and government organizations.  TOGAF provides both process and structure for business planning.

You may have a few committed DevOps evangelists who disagree with the structure of TOGAF, but in reality, once the “guardrails” are in place, even DevOps can fit into the process.  TOGAF and other frameworks are not intended to stifle innovation – just to steer that innovation toward the goals of the organization, not the goals of the innovators.

While just one of several candidate enterprise architecture frameworks (others include the US Federal Enterprise Architecture Framework/FEAF and the Department of Defense Architecture Framework/DoDAF), TOGAF is now widely accepted, and its accompanying certifications are well understood within government and enterprise.
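The planning cycle in the bullets above can be caricatured in a few lines of code.  This is a toy sketch of the gap-analysis step only – the capability names are hypothetical, and real TOGAF artifacts are far richer than simple string sets:

```python
# Toy sketch of the "AS-IS" vs "Target" gap analysis step described above.
# Capability names are hypothetical examples, not TOGAF-defined artifacts.

def gap_analysis(as_is: set, target: set) -> set:
    """Capabilities required by the target architecture but missing today."""
    return target - as_is

as_is = {"on-prem email", "physical servers", "manual provisioning"}
target = {"hosted email (SaaS)", "IaaS compute", "manual provisioning",
          "automated provisioning"}

for gap in sorted(gap_analysis(as_is, target)):
    # Each iteration of the cycle: develop a solution for the gap,
    # implement, measure, improve, and re-iterate.
    print("Develop solution for:", gap)
```

The point of the sketch is the shape of the loop, not the set arithmetic: the framework keeps everyone arguing about the same explicit list of gaps rather than about technology for its own sake.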

What’s an IT Guy to Do?

Now we can send the “iterative” process back to the ICT guy’s viewpoint.  Much like the telecom engineers who operated DMS 250s, 300s, and 500s, the existing IT and ICT professional corps will need to face reality: either embrace cloud computing, or hope they are close to retirement.  Who needs a DMS 250 engineer in a world of soft switches?  Who needs a server manager in a world of Infrastructure as a Service?  Unless, of course, you work as an infrastructure technician at a cloud service provider…

Ditto for those who specialize in maintaining copies of MS Office and a local MS Exchange server.  Sadly, your time is limited, and quickly running out.  Either become a cloud computing expert – in some field within cloud computing’s broad umbrella of components – or plan to be part of the business process.  To be effective as a member of the organization’s business team, you will need skills beyond IT: you will need to understand how ICT is used to meet business needs, and the impact of a rapidly evolving toolkit offered by all strata of the cloud stack.

Even better, become a leader in the business process.  If you can navigate your way through a TOGAF course and certification, you will acquire a much deeper appreciation for how ICT tools and resources could, and likely should, be planned and employed within an organization to contribute to the success of any individual project, or the re-engineering of ICTs within the entire organization.


John Savageau is TOGAF 9.1 Certified


Big data.  There are few conversations in the IT community which do not start, address, or end on the topic.  Some conversations are visionary in nature, some critical, and many considering the challenges we’ll need to overcome in the process of understanding how to deal with big data.

Definitions of big data generally parallel the “3 V’s” of big data: velocity, volume, and variety.  We may call it a byproduct of massive data growth, cheap technology, and our ability to save everything.  We may point to the enormous volumes of data generated by social media, smart grids, and other online transactions.

However, we also need to compress those ideas into a form we can understand and use as a basis for our discussions.  For simplicity, we’ll use Gartner’s definition: “big data are high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision-making, insight discovery and process optimization.”

Enhanced decision-making?  Insight discovery?  Process optimization?  Outstanding opportunities to enhance our decision support process.  Just corral all the data available to us, and voila, we’re competitive on a global scale.

One minor snag – we’ll need to recruit technical and business staff with both the skills and the business experience needed to effectively exploit the potential of big data, and to establish its value to business, government, and quality of life.

A well-quoted study by the McKinsey Global Institute (MGI) presents data supporting the idea that big data will ultimately become a key factor in competition and competitiveness across all public and private sectors.  The study also states that in the United States alone we will have a shortfall of nearly 190,000 professionals with the deep analytical, coding, and mathematical skills required.

The Australian Workspend Institute acknowledges there is a shortage of talent, and that the shortage is getting worse.  With the Australian mining, oil, and gas industries identifying a shortfall of 100,000 skilled engineers, the “war for talent” is further driving up the cost of labor for staff with deep analytical skills across the board.

Boston-based NewVantage Partners published the results of an executive survey of leading Fortune 1000 companies, in which only 2% of those companies felt they had no challenges finding the talent and skilled resources needed to understand and exploit big data.

Is it really that important to be on top of big data?

A 29 March 2012 press release from the Executive Office of the President (US) unveiling the country’s “Big Data Research and Development  Initiative” stated “by improving our ability to extract knowledge and insights from large and complex collections of digital data, the initiative promises to solve some of the nation’s most pressing problems.”

Perhaps crime?  BBC’s Horizon aired a special on big data in early 2013, highlighting the Los Angeles Police Department’s (LAPD) use of big data and analytics in predicting crime.  Drawing on more than 30 million crime records from LA, plus demographic, geospatial, and other historical data from both LA and around the world, the LAPD uses analytics to predict, down to a 300 sq ft area, when and where a crime will likely occur.  With uncanny, almost creepy results.
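To make the idea concrete, here is an illustrative sketch – emphatically not the LAPD’s actual model – of the simplest form of geographic hotspot analysis: bucket historical incidents into fixed-size grid cells and rank the cells by incident count.  The cell size and the incident coordinates are invented for the example:

```python
# Illustrative (NOT the LAPD's model) grid-based hotspot ranking:
# bucket historical incidents into fixed-size cells, count per cell.
import math
from collections import Counter

CELL_DEG = 0.005  # cell size in degrees - an assumed, city-block-ish scale

def cell_of(lat: float, lon: float) -> tuple:
    """Map a coordinate onto its grid cell."""
    return (math.floor(lat / CELL_DEG), math.floor(lon / CELL_DEG))

# Hypothetical historical incident coordinates (downtown LA area)
incidents = [
    (34.0522, -118.2437),
    (34.0524, -118.2439),
    (34.0523, -118.2436),
    (34.0610, -118.3001),
]

counts = Counter(cell_of(lat, lon) for lat, lon in incidents)
hottest_cell, n = counts.most_common(1)[0]
print(f"Hottest cell {hottest_cell} has {n} historical incidents")
```

Real predictive-policing systems layer time-of-day, demographics, and learned models on top of this kind of spatial aggregation, which is exactly why they demand the deep analytics skills the rest of this article worries about.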

Health information?  Climate change?  Mining? Earthquakes? Planets which may support human life?

Given the amount of data available, data being created, and the ability of intelligent data scientists to create models of the data value, most believe those who can harvest big data will gain significant advantage.

So what do we do, give up on big data and focus on other activities?

In a recent Forbes article on big data, author Ben Woo of Neuralytix wrote, “when it comes to big data, we believe if you’re not doing it, your competitors are!”

The United States still holds a significant advantage, leading the world in individuals with skill and experience in deep analytics.  According to the MGI the US, China, India, and Russia lead the world in gross numbers of capable data scientists, while Poland and Romania are graduating the highest numbers of students skilled in deep analytics.

As noted, as the world continues to embrace the science of data, those skilled individuals will continue to be in demand.  CIO Magazine notes that “CIOs are competing for workers with strong math skills, proficiency working with massive databases, and emerging database technologies, in addition to workers with expertise in search, data integration, and business knowledge.”

The skills CIOs believe are the most difficult to find and recruit include:

  • Advanced analytics and predictive analysis skills
  • Complex event processing skills
  • Rule management skills
  • Business intelligence (BI) skills
  • Data integration skills

David Foote, Foote Partners’ Chief Research Officer, writes that “many colleges and universities haven’t yet risen to the challenge of teaching the skills that are potentially needed for analytics jobs.”

Industry may be rising to the need, partnering with universities to form the National Consortium for Data Science (NCDS).  According to a paper by the University of North Carolina Kenan-Flagler Business School (UNC), the NCDS agenda is to better align university programs with the needs of the private sector (and government).

UNC further advises four steps human resources and talent management organizations can take to bridge the skills and talent shortfall in big data analytics:

  1. Educating themselves about big data.  If the organization does not understand big data at an operational level, it is unlikely they will be able to recruit skilled data scientists who do understand the concept.
  2. Educate managers and senior leaders about big data.
  3. Develop creative strategies to recruit and retain big data talent.
  4. Offer plans and solutions on how to build the talent in-house.

NOTE:  As a particularly positive note for Southern California readers, California State University – Long Beach (Long Beach State) was identified in UNC’s report as one of the few schools in the world with a strong, focused deep-data analytics program.

While there is no immediate solution to the global shortfall in big data talent, we are finally awakening to the need.  The academic community will need to consider, design, and implement curricula and programs that acknowledge the need for graduates capable of dealing with big data.  This means going back to the basics of mathematics, statistics, analytics, and the other disciplines that produce capable data scientists.

The Internet, computers, and ability to collect data has changed our world, and created many difficult challenges in our ability to understand the opportunities.  Time to step up to the challenge.


A good indication any new technology or business model is starting to mature is the number of certifications popping up related to that product, framework, or service.  Cloud computing is certainly no exception, with vendors such as Microsoft, Google, VMware, and IBM offering certification training for their own products, and organizations such as CompTIA and Architura competing for industry-neutral certifications.

Is this all hype, or is it an essential part of the emerging cloud computing ecosystem?  Can we remember the days when entry level Cisco, Microsoft, or other vendor certifications were almost mocked by industry elitists?

Much like the early Internet days of e-everything, cloud computing is at the point where most have heard the term, few understand the concepts, and marketing folk are exploiting every possible combination of the words to place their products in a favorable, forward-leaning light.

So, what if executive management takes a basic course in cloud computing principles, or sales and customer service people take a Cloud 101 course?  Is that bad?

Of course not.  Cloud computing has the potential to be transformational to businesses, governments, organizations, and even individuals.  Business leaders need to understand the potential and impact of what a service-oriented cloud computing infrastructure might mean to their organization, the game-changing potential of integration and interoperability, the freedom of mobility, and the practical execution of basic cloud computing characteristics within their ICT environment.

A certification is not just about passing the test and collecting the certificate.  As an instructor for the CompTIA course, I manage classes of 20 or more students ranging from engineers, to network operations center staff, to customer service and sales, to mid-level executives.  We’ve yet to encounter an individual who claims to have learned nothing from attending the course, and most leave with a very different viewpoint of cloud computing than they held prior to the class.

As with most technology driven topics, cloud computing does break into different branches – including technical, operations, and business utility.

The underlying technologies of cloud computing are probably the easiest part of the challenge, as ultimately skills will develop based on time, experience, and operation of cloud-related technologies.

The more difficult challenge is understanding what cloud computing may mean to an organization, both internally and on a global scale.  No business-related discussion of cloud computing is complete without consideration of service-oriented architectures, enterprise architectures, interoperability, big data, disaster management, and continuity of operations.

Business decisions on data center consolidation, ICT outsourcing, and other aspects of the current technology refresh or financial consideration will be more effective and structured when accompanied by a basic business and high-level understanding of the technologies underlying cloud computing.  As an approach to business transformation, additional complementary capabilities in enterprise architecture, service-oriented architectures, and IT service management will certainly help senior decision makers understand the relationship between cloud computing and their organizational planning.

While reading the news, clipping stories, and self-study may help decision makers understand the basic components of cloud computing and other supporting technologies, taking an introductory cloud computing course, whether vendor training or neutral, will give enough background knowledge to at least engage in the conversation.  Given the hype surrounding cloud computing, and the potential long-term consequences of making an uninformed decision, the investment in cloud computing training must be considered valuable at all levels of the organization, from technical staff to senior management.

The current technology refresh cycle presents many opportunities and challenges to both organizations and governments.  The potential of service-oriented architectures, interoperability, collaboration, and continuity of operations is an attractive outcome of the technologies and business models available today.  The challenges are more related to business processes and human factors, both of which require organizational transformation to take best advantage of the collaborative environments enabled through the use of cloud computing and access to broadband communications.

Gaining the most benefit from planning an interoperable environment for governments and organizations may be facilitated through use of business tools such as cloud computing.  Cloud computing and underlying technologies may create an operational environment supporting many strategic objectives being considered within government and private sector organizations.

Reaching target architectures and capabilities is not a single action, and will require a clear understanding of the current “as-is” baseline capabilities, the target requirements, the gaps or capabilities needed to reach the target, and a clear transitional plan to bring the organization from the “as-is” baseline to the target goal.

To most effectively reach that goal requires an understanding of the various contributing components within the transformational ecosystem.  In addition, planners must keep in mind the goal is not implementation of technologies, but rather consideration of technologies as needed to facilitate business and operations process visions and goals.

Interoperability and Enterprise Architecture

Information technology, particularly communications-enabled technology, has enhanced business processes, education, and the quality of life for millions around the world.  However, ICT has traditionally created silos of information which are rarely integrated or interoperable with other data systems or sources.

As the science of enterprise architecture development and modeling, service-oriented architectures, and interoperability frameworks continue to force the issue of data integration and reuse, ICT developers are looking to reinforce open standards allowing publication of external interfaces and application programming interfaces.

Cloud computing, a rapidly maturing framework for virtualization, standardized data, application, and interface structure technologies, offers a wealth of tools to support development of both integrated and interoperable ICT  resources within organizations, as well as among their trading, shared, or collaborative workflow community.

The Institute for Enterprise Architecture Development defines enterprise architecture (EA) as a “complete expression of the enterprise; a master plan which acts as a collaboration force between aspects of business planning such as goals, visions, strategies and governance principles; aspects of business operations such as business terms, organization structures, processes and data; aspects of automation such as information systems and databases; and the enabling technological infrastructure of the business such as computers, operating systems and networks.”

ICT, including utilities such as cloud computing, should focus on supporting the holistic objectives of organizations implementing an EA.  Data that is not interoperable or shared will generally have less value than reusable data, which also greatly increases systems reliability and data integrity.

Business Continuity and Disaster Recovery (BCDR)

Recent surveys of governments around the world indicate, in most cases, limited or no disaster management or continuity of operations planning.  The risk of losing critical national data resources due to natural or man-made disasters is high, and the ability of most governments to maintain government and citizen services during a disaster is limited, based on the amount of time required to restart government services (recovery time objective/RTO), as well as the point to which data can be restored (recovery point objective/RPO).

In existing ICT environments, particularly those with organizational and data resource silos, RTOs and RPOs can extend nearly indefinitely if neither a data backup plan nor the systems and service restoral capacity is in place.  This is particularly acute if the processing environment includes legacy mainframe computer applications which do not have a mirrored recovery capacity available upon failure or loss of service due to disaster.
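The relationship between a backup plan and the two objectives can be checked with simple arithmetic.  A minimal sketch (the class, figures, and thresholds below are hypothetical, not drawn from any particular product):

```python
from dataclasses import dataclass

@dataclass
class BackupPlan:
    backup_interval_hours: float  # time between successive backups
    restore_time_hours: float     # time to rebuild services from the last backup

def meets_objectives(plan: BackupPlan, rpo_hours: float, rto_hours: float) -> bool:
    # Worst-case data loss equals the backup interval (RPO);
    # worst-case downtime equals the restore time (RTO).
    return (plan.backup_interval_hours <= rpo_hours
            and plan.restore_time_hours <= rto_hours)

# A nightly backup with a 12-hour rebuild cannot meet a 4-hour RPO / 8-hour RTO.
nightly = BackupPlan(backup_interval_hours=24, restore_time_hours=12)
print(meets_objectives(nightly, rpo_hours=4, rto_hours=8))  # False
```

The point of the sketch is that RTO and RPO are requirements, while the backup interval and restoral capacity are what the organization actually controls.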

Cloud computing can provide a standards-based environment that fully supports near-zero RTO/RPO requirements.  Within the current limitation that cloud computing is based on Intel-compatible architectures, nearly any existing application or data source can be migrated into a virtual resource pool.  Once within the cloud computing Infrastructure as a Service (IaaS) environment, setting up distributed processing or backup capacity is relatively uncomplicated, assuming the environment has adequate broadband access to the end user and between processing facilities.

Cloud computing-enabled BCDR also opens opportunities for developing public-private partnerships (PPPs), or for outsourcing into public or commercially operated cloud computing compute, storage, and communications infrastructure.  Again, the main limitation is the requirement for portability between systems.

Transformation Readiness

ICT modernization will drive change within all organizations.  Transformational readiness is not a matter of technology, but a combination of factors including rapidly changing business models, the need for many-to-many real-time communications, flattening of organizational structures, and the continued entry of technology and communications savvy employees into the workforce.

The potential of outsourcing utility compute, storage, application, and communications will eliminate the need for much physical infrastructure, such as redundant or obsolete data centers and server closets.  Roles will change based on the expected shift from physical data centers and ICT support hardware to virtual models based on subscriptions and catalogs of reusable application and process artifacts.

A business model for accomplishing ICT modernization includes cloud computing, which relies on technologies such as server and storage resource virtualization, and adds operational characteristics such as on-demand resource provisioning that reduce the time needed to procure ICT resources in response to emerging operational or other business opportunities.

IT management and service operations move from a workstation environment to a user interface driven by SaaS.  The skills needed to drive ICT within the organization will need to change, becoming closer to the business, while reducing the need to manage complex individual workstations.

IT organizations will need to change, as organizations may elect to outsource most or all of their underlying physical data center resources to a cloud service provider, either in a public or private environment.  This could eliminate the need for some positions, while driving new staffing requirements in skills related to cloud resource provisioning, management, and development.

Business unit managers may be able to take advantage of other aspects of cloud computing, including access to on-demand compute, storage, and applications development resources.  This may increase their ability to quickly respond to rapidly changing market conditions and other emerging opportunities.   Business unit managers, product developers, and sales teams will need to become familiar with their new ICT support tools.  All positions from project managers to sales support will need to quickly acquire skills necessary to take advantage of these new tools.

The Role of Cloud Computing

Cloud computing is a business representation of a large number of underlying technologies.  Encompassing virtualization, development environments, and hosted applications, cloud computing provides a framework for developing standardized service models, deployment models, and service delivery characteristics.

The US National Institute of Standards and Technology (NIST) provides a definition of cloud computing accepted throughout the ICT industry.

“Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

While organizations face challenges related to developing enterprise architectures and interoperability, cloud computing continues to rapidly develop as an environment with a rich set of compute, communication, development, standardization, and collaboration tools needed to meet organizational objectives.

Data security, including privacy, is different within a cloud computing environment, as the potential for data sharing is expanded among both internal and potentially external agencies.  Security concerns are expanded when questions of infrastructure multi-tenancy, network access to hosted applications (Software as a Service / SaaS), and governance of authentication and authorization raise questions on end user trust of the cloud provider.

A move to cloud computing is often associated with data center consolidation initiatives within both governments and large organizations.  Cloud delivery models, including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) support the development of virtual data centers.

While it is clear the long-term target architecture for most organizations will be an environment with a single data system, in the short term it may be more important to decommission high-risk server closets and unmanaged servers into a centralized, well-managed data center environment offering on-demand access to compute, storage, and network resources, as well as BCDR options.

Even at the most basic level of considering IaaS and PaaS as a replacement environment to physical infrastructure, the benefits to the organization may become quickly apparent.  If the organization establishes a “cloud first” policy to force consolidation of inefficient or high risk ICT resources, and that environment further aligns the organization through the use of standardized IT components, the ultimate goal of reaching interoperability or some level of data integration will become much easier, and in fact a natural evolution.

Nearly all major ICT-related hardware and software companies are re-engineering their product development to either drive cloud computing, or be cloud-aware.  Microsoft has released their Office 365 suite of online and hosted environments, as has Google with both PaaS and SaaS tools such as Google App Engine and Google Docs.

The benefit for organizations considering a move to hosted environments such as MS 365 is access to a rich set of applications and resources available on demand, using a subscription model rather than a licensing model, and offering a high level of standardization to developers and applications.

Users comfortable with standard office automation and productivity tools will find the same features in a SaaS environment, while still being relieved of individual software license costs, application maintenance, or potential loss of resources due to equipment failure or theft.  Hosted applications also allow a persistent state, collaborative real-time environment for multi-users requiring access to documents or projects.  Document management and single source data available for reuse by applications and other users, reporting, and performance management becomes routine, reducing the potential and threat of data corruption.

The shortfall, particularly for governments, is that using a large commercial cloud infrastructure and service provider such as Microsoft may require physically storing data in locations outside of their home country, as well as forcing data into a multi-tenant environment which may not meet the organization’s security requirements.

Cloud computing offers an additional major feature at the SaaS level that will benefit nearly all organizations transitioning to a mobile workforce.  SaaS is by definition platform independent: users access SaaS applications and their underlying data through a browser, from any device with a network connection to an Internet-connected address.  The actual intelligence of an application resides at the server or virtual server, while the user device is simply a dumb terminal displaying a portal, access point, or the results of a query or application executed through a command at the user screen.

Cloud computing continues to develop as a framework and toolset for meeting business objectives.  It is well-suited to respond to rapidly changing business and organizational needs: on-demand access to infrastructure resources, rapid elasticity (the ability to provision and de-provision resources as needed to meet processing and storage demand), and the ability to measure cloud computing resource use for internal and external accounting together mark a major change in how an organization budgets ICT.
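The budgeting shift, elastic provisioning plus measured use, can be sketched in a few lines.  The instance capacity and hourly rate below are illustrative assumptions, not figures from any provider:

```python
import math

def provisioned_instances(demand_units: float, units_per_instance: float = 100.0) -> int:
    # Rapid elasticity: hold just enough instances to cover current demand,
    # de-provisioning automatically as demand falls (minimum of one).
    return max(1, math.ceil(demand_units / units_per_instance))

def metered_cost(instance_hours: float, rate_per_hour: float) -> float:
    # Measured service: billing follows actual use, not fixed capacity.
    return instance_hours * rate_per_hour

print(provisioned_instances(250))  # 3 instances cover 250 units of demand
print(round(metered_cost(3 * 24, rate_per_hour=0.10), 2))  # one day's cost
```

Contrast this with a traditional capital budget, where the organization pays for peak capacity whether or not it is used.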

As cloud computing matures, each organization entering a technology refresh cycle must ask the question: “are we in the technology business, or should we concentrate our efforts and budget on directly supporting business objectives?”  If the answer is the latter, the organization should evaluate outsourcing its ICT infrastructure to an internal or commercial cloud service provider.

It should be noted that today most cloud computing IaaS service platforms will not support migration of mainframe applications, such as those written for a RISC processor.  Those applications require redevelopment to operate within an Intel-compatible processing environment.

Broadband Factor

Cloud computing components are currently implemented over an Internet Protocol network.  Users accessing SaaS applications will need network access to connect with applications and data.  Depending on the amount of graphics information transmitted from the host to an individual user access terminal, poor bandwidth or lack of broadband could result in an unsatisfactory experience.

In addition, BCDR requires the transfer of potentially large amounts of data between primary and backup locations. Depending on the data parsing plan, whether mirroring data, partial backups, full backups, or live load balancing, data transfer between sites could be restricted if sufficient bandwidth is not available between sites.
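The arithmetic behind that constraint is worth running before committing to a replication design.  A rough sketch, using illustrative figures and an assumed efficiency factor for protocol overhead:

```python
def transfer_time_hours(data_gb: float, bandwidth_mbps: float,
                        efficiency: float = 0.7) -> float:
    # Time to move a backup set between sites.  The efficiency factor is an
    # assumption covering protocol overhead and link contention.
    bits_to_move = data_gb * 8e9
    usable_bits_per_sec = bandwidth_mbps * 1e6 * efficiency
    return bits_to_move / usable_bits_per_sec / 3600

# Replicating a 500 GB nightly backup over a 100 Mbps link:
print(round(transfer_time_hours(500, 100), 1))  # roughly 15.9 hours
```

If the nightly backup window is eight hours, this link cannot keep the sites synchronized, regardless of how well the rest of the BCDR plan is designed.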

Cloud computing is dependent on broadband as a means of connecting users to resources and transferring data between sites.  Any organization considering implementing cloud computing outside of its local area network will need to fully understand what shortfalls or limitations may prevent the cloud implementation from meeting objectives.

The Service-Oriented Cloud Computing Infrastructure (SOCCI)

Governments and other organizations are entering a technology refresh cycle as existing ICT hardware and software infrastructure reaches end of life.  In addition, as the world aggressively continues to break down national and technical borders, the need for organizations to reconsider the creation, use, and management of data supporting both mission-critical business processes and decision support systems will drive change.

Given the clear direction industry is taking to embrace cloud computing services, as well as the awareness that the siloed data structures existing within many organizations would better serve them in a service-oriented framework, it makes sense to consider an integrated approach.

A SOCCI considers both, adding reference models and frameworks, including enterprise architecture models such as TOGAF, to ultimately provide a broad, mature framework supporting business and IT managers in their technology and business refresh planning process.

SOCCIs promote the use of architectural building blocks, publication of external interfaces for each application or data source developed, single-source data, reuse of data and standardized application building blocks, and the development and use of enterprise service buses to promote further integration and interoperability of data.
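The enterprise service bus pattern mentioned above can be sketched as a minimal publish/subscribe broker.  The topic name and payload below are hypothetical; a production bus would add routing, transformation, and delivery guarantees:

```python
from collections import defaultdict
from typing import Any, Callable

class ServiceBus:
    # Minimal sketch of an enterprise service bus: producers publish named
    # events, consumers subscribe to topics, and no application needs to
    # know about any other directly.
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], Any]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

bus = ServiceBus()
received = []
bus.subscribe("customer.updated", received.append)  # e.g. a billing system
bus.publish("customer.updated", {"id": 42, "name": "Acme"})
print(received)  # [{'id': 42, 'name': 'Acme'}]
```

The decoupling is the point: new consumers of "customer.updated" can be added without modifying the publisher, which is what makes single-source data reusable across an organization.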

A SOCCI treats elements of cloud computing, such as virtualized and on-demand compute/storage resources and access to broadband communications, including security, encryption, switching, routing, and access, as a utility.  The utility is always available to the organization for use and exploitation.  Higher-level cloud components, including PaaS and SaaS, add value and higher-level entry points for developing the ICT tools needed to meet the overall enterprise architecture and service-orientation the organization requires.

According to the Open Group, a SOCCI framework provides the foundation for connecting a service-oriented infrastructure with the utility of cloud computing.  As enterprise architecture and interoperability frameworks continue to gain value and importance to organizations, this framework will provide additional leverage to make best use of available ICT tools.

The Bottom Line on ICT Modernization

The Internet has reached nearly every point in the world, providing a global community functioning within an always-available, real-time communications infrastructure.  University and primary school graduates are entering the workforce with social media, SaaS, collaboration, and location-transparent peer communities diffused in their tacit knowledge and experience.

This environment has greatly flattened any leverage formerly developed countries, or large monopoly companies have enjoyed during the past several technology and market cycles.

An organization built on non-interoperable, non-standardized data, with no BCDR protection, will certainly risk losing its competitive edge in a world being created by technology- and data-aware challengers.

Given the urgency organizations face to address data security, continuity of operations, agility to respond to market conditions, and operational costs associated with traditional ICT infrastructure, many are looking to emerging technology frameworks such as cloud computing to provide a model for planning solutions to those challenges.

Cloud computing and enterprise architecture frameworks provide guidance and a set of tools to assist organizations in providing structure, and infrastructure needed to accomplish ICT modernization objectives.


We all know the buildings, One Wilshire, The Westin Building, 60 Hudson, Telehouse (UK), 200 Paul, 1102 Grand – all buildings advertising dozens, or even hundreds of carriers using the properties for interconnections at the fiber and network level.  Meet-me-rooms are crowded, ladder racks full, and each property sits in the middle of the central business district in large cities.

At one point, rumors circled the industry that if One Wilshire suffered a catastrophic failure of infrastructure, global communications might be set back to the mid-1960s.  True or not, the building’s meet-me-room supports hundreds of single-threaded interconnections connecting carriers in distant countries and continents.

All in buildings designed to house office users.  All buildings with little or no potential for external security.  All buildings with challenges for internal electrical distribution, cooling, and fiber infrastructure that is not going away anytime soon.

The Internet, and whatever the Internet is likely to evolve into in the future, will be a combination of high-performance wireless and fixed access connecting every square centimeter of countries to what we are now calling the Fourth Utility.  The Fourth Utility is a marriage of broadband infrastructure and cloud computing infrastructure.

As a utility, the infrastructure should be envisioned, planned, and implemented as basic infrastructure, with compromise only considered to accommodate exceptions, such as legal limitations or geological limitations.  Where we need to interconnect segments of this infrastructure, as in carrier hotels, the interconnection points should be designed as infrastructure, and not a compromise.

Of course this is not surprising.  Carrier hotels by design evolved from a need to find methods for competitive carriers and networks to directly interconnect without the requirement to use an incumbent or formerly monopoly carrier as a transit point.  During the period of global telecom deregulation in the 1980s and 1990s those carriers scrambled to find common interconnection points near metro and long distance fiber routes.

Neutral locations in close proximity to major metro, long distance, and transcontinental submarine cable routes were found in the central business districts (CBDs) of Los Angeles, Seattle, Miami, New York, and London.  Most of those locations (with the exception of Miami’s NAP of the Americas) did not have the space, nor did carriers have the money, to construct a proper central office-grade facility in the CBD to accommodate the electronic switches and muxes needed to support the carriers.

Thus the most suitable, and available, office building was selected to meet the most basic needs of the carriers.

During the 1990s global telecom deregulation progressed and changes in ownership of submarine cables allowed large numbers of international carriers to establish a presence in the carrier meet-me-rooms (MMRs) in buildings such as One Wilshire.  The MMRs were operated by building landlords, with little or no telecom industry operational experience, resulting in installations which were far below normal telecom industry standards.

While MMRs have improved greatly during the past few years, the reality is we have a tremendous amount of national infrastructure being built into properties not designed for the telecom industry – infrastructure that will continue being more and more essential to our ability both as a nation, and as a member of the global economic and social community.

The United States should view telecom and cloud computing as a utility, critical to the national infrastructure.  Standards that follow the same principles of roads, water, and electrical distribution must be applied to the telecom industry, including carrier hotels and other implementations contributing to the Fourth Utility.

The telecom industry must not accept new MMR or carrier hotel infrastructure that is not custom designed to suit the needs of carriers requiring interconnections.  In addition, no infrastructure can tolerate single points of failure on the backbone.  You cannot control a point of failure at every access point, much like an access road washing out during a flood; the backbone, however, must have resiliency and redundancy.

Here is a call to action for the telecom industry.  Do not accept, support, contribute to, or participate in infrastructure deployments which do not provide the levels of operational and physical security needed to ensure the critical Fourth Utility of telecom infrastructure can protect our national and global interests.

EDITOR’S NOTE:  This article by the author originally appeared in BurbankNBeyond.  This and the interview with Councilman David Gordon are the final articles in a series researching the introduction of a proposed Pet Sales and Breeding Regulations Ordinance that may potentially eliminate the sale of commercially bred dogs and cats in the City of Burbank.

BurbankNBeyond requested interviews with city council members to learn and publish their positions and opinions on the topic, issues, and proposed ordinance.  Vice Mayor Gabel-Luddy and Councilman Dr. David Gordon agreed to discuss the issue with BurbankNBeyond; the other council members and the Mayor did not respond to requests for an interview.

The interview segment with Councilman Gordon is the final in this series.  You can read Councilman Gordon’s interview to gain a different perspective and viewpoint on the issue.

Previous articles in the series include:

_______________________________

Vice Mayor Gabel-Luddy:  The issue is about mill puppies.  Puppies that are purchased from known puppy mills or factory breeding facilities.  There is a critical distinction to make.  Nobody is against adopting or selling puppies from reputable breeders, but I think the community, the residents, through their testimony (at city council meetings), their letters, and their petitions, have made it pretty clear that they are in opposition to perpetuating puppy mills by purchasing puppies from those kinds of breeding facilities.

But they’re not opposed to having puppies.

The community made a very clear point of that during our last council meeting (16 Oct).

BurbankNBeyond:  How did you get involved?

Vice Mayor Gabel-Luddy:  I brought it to council at the first step for consideration by council as a whole.  It was brought to my attention by Burbank residents.  Any council member can introduce an item for additional discussion and possible action.

City staff then brought a report to council for discussion on whether to proceed or not.

BurbankNBeyond:  Within the city, how big a problem is this (puppy mill discussion)?  Is this a big enough problem that it really justifies this much attention and emotion for discussion by the Burbank City Council, as well as the residents of Burbank?

Vice Mayor Gabel-Luddy:  Well the residents of Burbank have really spoken up and they themselves have said they want this kind of business stopped.  And frankly it’s about the humane treatment of, in this case, dogs.

As long as there is demand for puppies from puppy mills there will be factory breeding facilities.  While it is clear they are not (commercial breeders) in Burbank, I think it is pretty clear the community has said we don’t want to be participants in this inhumane practice.

There are no breeding facilities in our community.

BurbankNBeyond:  What impact will this, a single community have in the long term objective in stopping puppy mills?

Vice Mayor Gabel-Luddy:  The Animal Welfare Act is nearly 50 years old.

One of the most impressive things that I think can happen is people at the grass roots level, and that may be an individual neighborhood, or an individual community starts to take action.

It’s by that kind of community by community action that we finally see an effect at the state or national level.  Rather than look at this as a city by city thing, I think, my experience has been that it is extremely grass roots.

People are recognizing that breeding a female over and over and over again with multiple litters on an annual basis, in the conditions that have been documented.

If you look at the materials presented you will see this is factual information.  I think grassroots is good.

BurbankNBeyond:  Should the federal government take a more proactive approach to the issue?

Vice Mayor Gabel-Luddy:  It is always good to approach your legislators, whether local, state, or federal and ask for a strengthening of laws or reinforcement of laws.

I don’t think any of us would argue with that.

The City of Burbank’s only focus is on the community.  I am certain that the direction we (City of Burbank) take on this will gain the attention of state legislators.

BurbankNBeyond:  Should I have the right to adopt or acquire any puppy I want, a husky puppy, or a malamute puppy?

Vice Mayor Gabel-Luddy:  There are a variety of sources for acquiring puppies.  And there has never been any problem with buying them from a reputable breeder, or adopting from a rescue, or adopting from a specialty breed rescue.

There is no problem acquiring a puppy from a breeder if you are seeking a particular breed.  There are plenty of reputable breeders who breed those kinds of pure-bred dogs.

What I’ve been impressed with when you go to someone who is a reputable breeder is that first of all they do not “sell” their dogs.  Secondly they interview the people who are considering buying puppies.  And thirdly they make the potential buyer very much aware of the temperament of the dog to see whether or not the owner understands what’s required, and if it’s the right kind of dog.

It’s a much more one on one understanding.  Many of the reputable breeders will require the buyer to spay or neuter the dogs.

BurbankNBeyond:  Do you have a position on puppies bought or sold over the Internet?

Vice Mayor Gabel-Luddy:  There are so many legitimate breeders in Southern California that I’m surprised someone wouldn’t just take the time to see the litter and meet the parents, see what kind of facility they are being raised in, and conclude their purchase in that manner.  That seems like the most humane thing to do.

BurbankNBeyond:  What happens if this ordinance passes?

Vice Mayor Gabel-Luddy:  I think it may result in some business changes for stores which sell mill puppies.  But I don’t think it puts them out of business.  And it doesn’t preclude them from buying puppies from reputable breeders, or acquiring puppies from an adoption or a shelter.

I think it will resolve concerns raised by the community.

I wouldn’t say this is a Best Friends (Animal Society) driven discussion.  If it was a Best Friends driven discussion you wouldn’t have the hundreds of letters, petitions, and letters to the editor.

BurbankNBeyond:  What do Burbank residents need to know about this issue prior to the next council meeting?

Vice Mayor Gabel-Luddy:  It is very important that residents come to Council, or communicate with Council and let them know their position on this (discussion).  It is important that residents continue to educate themselves on what the alternatives are, and it is important that residents speak out.

If residents continue to show up as they have in the past, they are demonstrating to all of us that they want to put an end to this inhumane kind of treatment of animals.

I think our community should know that our staff reports are coming out usually a few days before the Council meeting, and should be available by next Thursday.

Just like people have weighed in on the Internet, I think it is important to continue the dialog.

During the last community meeting on October 16th there was a preponderance of evidence that Peggy Woods (Pet Emporium) has purchased puppies from puppy mills, that the puppies came from Missouri, and that there was a USDA record of violations at the facility during the time one of the dogs offered for sale at Peggy Woods was there.

So it seems to me there is a preponderance of evidence that they (Peggy Woods) bought from puppy mills.

From my point of view that’s not an ethical business practice.

BurbankNBeyond:  Why do you think Councilman Gordon would be so reluctant to support the issue or community’s position?

Vice Mayor Gabel-Luddy:  After all the presentations were done (at the end of the October meeting) I asked all of my colleagues on council whether the presentation by Ms. Rizzotti (Shelly Rizzotti, BurbankCROPS) changed anybody’s mind about how they felt about things, because her research was so specific and, from my point of view, so impeccable.  And I know Mr. Golonski (Mayor, City of Burbank) answered me, but Councilmember Gordon did not.

So I don’t know.  It seemed to be clearly the community’s desire to change that business model (sales of commercially bred puppies) here in Burbank, which I applaud.

We raised the possibility his son worked at Peggy Woods, and I think he admitted as much.  So I don’t know.


EDITOR’S NOTE:  This interview was originally published by the author at BurbankNBeyond. This interview and the interview with Vice Mayor Emily Gabel-Luddy are the final articles in a series researching the introduction of a proposed Pet Sales and Breeding Regulations Ordinance that could potentially eliminate the sale of commercially bred dogs and cats in the City of Burbank.

BurbankNBeyond requested interviews with city council members to learn and publish their positions and opinions on the topic, issues, and proposed ordinance.  Vice Mayor Gabel-Luddy and Councilman Dr. David Gordon agreed to discuss the issue with BurbankNBeyond; the other council members and the Mayor did not respond to requests for an interview.

This interview segment with Councilman Gordon is the final in this series.  You can also read Vice Mayor Gabel-Luddy’s interview for a different perspective and viewpoint on the issue.

Previous articles in the series include:

____________________________

BurbankNBeyond:  Can you give us an overview of your position regarding the sale of commercially bred puppies in Burbank?

Councilman David Gordon:  Well, I have done quite a bit of research since this whole issue came up.  As far as I can understand, there is no disagreement from me about putting an end, in any legal way, to the so-called puppy mills.  And I think we need a good definition of that.

If any breeder, in the process of breeding these animals, subjects them to harm or to unsafe or unhealthy conditions, I don’t think any reasonable person would oppose shutting them down.  I wouldn’t oppose that.

The concern I have is the approach being taken, and I can just give a few things that I’ve learned.

First of all, if they ban the pet stores, it is going to drive the sales of dogs underground.  It does nothing to improve the welfare of pets whatsoever, and it doesn’t impact the puppy mills, whatever the definition is, because the unregulated non-pet-store sellers will obtain the dogs directly from puppy mills.  There is no obligation to reveal sources, and no obligation that has any impact on the puppy mills.

So that’s one thing.

The other thing is where the dogs come from.  There are thousands and thousands of dogs being imported into the United States.  I do have some information from various reliable sources that there are literally hundreds of thousands of dogs being imported from Mexico, and other countries for sale in the current retail market.

But when they come in, even from outside of California, they still have to comply with the Polanco Act, which calls for various vaccinations and health checks before they are marketed, with warranties and guarantees to the purchasers of the dogs.

The other problem, and the reason the Polanco Act is so important in protecting the public, is that these imported dogs often carry diseases with them; the CDC (Centers for Disease Control) in Atlanta has published articles on this.  These include new strains of rabies that are not common in the United States, where rabies has basically been eradicated, and other exotic diseases that aren’t really obvious at the puppy stage.

Although there are laws about importing dogs that haven’t had certain vaccinations in their country of origin, there is apparently a loophole for very young puppies, which are often what is most in demand.

So there are some very severe and significant potential health consequences, which I wasn’t aware of when the issue first came up, that I think need to be considered by the council in passing any ordinance.  Whatever the situation is now, I don’t think anyone would want to make it worse in terms of public health, above and beyond pet health and safety.

So that’s something I’ve learned.

Another thing is, if there is going to be any regulation as far as where these dogs come from, I don’t understand why any group that provides dogs should be exempted, whether it is a non-profit organization or a rescue group.  You can’t just put it on puppy mills.  Any party providing dogs should have to reveal the source of the dog and show that it has been properly processed through health procedures, with a vet’s inspections and such, prior to putting it out to the public.

I think this is very important, and I am very concerned that this is provided in an environment with no regulation.

There are ways, I think.  I have to wait and see what our city attorney has to say, because she has some comments she was going to provide to us, and I want to see her take on the legality of various recommendations.  But I don’t see why we, as a city, couldn’t have a regulation to prevent dogs from knowingly being sold from sources that are not properly attending to the animals’ welfare, or that have other ulterior motives for providing unhealthy dogs.

I don’t think there should be a blanket proscription in saying anyone who breeds dogs and sells them is a puppy mill.  I don’t believe that’s the case.

I also don’t believe it’s the case, as has been stated, that you can simply adopt the pure-bred dog of your choice.  I don’t buy that there are readily adoptable pure-bred dogs of the particular type those people may want.

I certainly don’t think any decision should be made in a hurried or rushed way, in an emotional way.  I think it is a valid concern that folks have about the welfare of all animals.

The question, I think, is more about the best way to do it (protect the welfare of animals); in my case, as a government official in Burbank.  I’m not worried about what they do in other states, because I can’t control that.

But I do have a say in what happens in Burbank.  So I am interested in the best solution that achieves, as much as possible, the goal of making sure dogs and cats sold in Burbank are healthy and come from reputable breeders.  We should do that.

I’m on board with that.

BurbankNBeyond:  Given all that is happening in Burbank, do you believe this topic justifies the amount of time being spent on it, compared with the other matters the Council could be spending time on?

Councilman David Gordon:  No, not at all.  I think it is of importance, and for some people it may be more important, but there are some very significant matters facing the council and the city.  There are public safety issues involving the police and fire departments; there are budgetary issues; there are infrastructure issues that affect people’s health and safety, since we must maintain our streets and sewers; and we have to ensure issues with our schools are addressed.  There are a number of issues I believe supersede this on the priority list.

That doesn’t mean this is something that shouldn’t be heard.  But given the overarching effort I’ve seen put forward, I would welcome input from any of these folks on any of the other issues affecting the city.  And I haven’t seen that.

Many of the folks who have come and spoken, or who have written, I’ve never interacted with before.  But judging by how energetic they are on this one particular issue, I’d hope they would see other city and social issues where we’d welcome their input.

BurbankNBeyond:  What do you think the final decision will look like?

Councilman David Gordon:  I have no idea what the final ordinance is that is going to be proposed.  I would hope that any ordinance will be one that every council member could support; one that is not only rational, but that makes sense in terms of protecting the public across the board.

Not just for a particular group that has a particular idea of the way it should be; we have to look at the bigger picture.  I would think my mission on the council, my charge on the council I should say, is basically to see what is best for the overall health and safety of the community.

And that takes into consideration what would happen to the imported dogs that now go through traditional sellers and health checks, as opposed to a totally unregulated, black-market provision of animals.

So when you talk about what the ordinance will do, I don’t think any ordinance in the city of Burbank is going to stop any problems associated with illicit or inappropriate dog breeding across the country.  It may send a message of some sort, but what the message is, and how effective it is, I really don’t know.

I’ll wait to see what ordinance is presented to the council to consider and act on.

BurbankNBeyond:  What do the residents of Burbank need to know about this, and what do they need to prepare to engage in the discussion?

Councilman David Gordon:  I think the residents of Burbank could somehow be informed that there are proposals being considered by the council that would take away the ability of pet stores to sell cats and dogs other than those which are somehow obtained by adoption.

Their ability to purchase a particular breed by going into a store and selecting it from a display or ordering it will be eliminated.

If they (Burbank residents) are OK with it – fine.  If they are not OK with it, I would hope they would come on down and participate in the dialog I am sure is going to take place at the council this week.

_________________________________

EDITOR’S NOTE:  The Burbank City Council Meeting Agenda for 29 January is posted, and has the following materials available for public review and download:

COUNCIL AGENDA – CITY OF BURBANK

TUESDAY, JANUARY 29, 2013

5:00 P.M. – Council Chamber, 275 E. Olive Avenue

Introduction of Pet Sales and Breeding Regulations Ordinance – Community Development Department:
Recommendation:
Introduce AN ORDINANCE OF THE COUNCIL OF THE CITY OF BURBANK AMENDING TITLE 5 OF THE BURBANK MUNICIPAL CODE TO PROHIBIT THE SALE OF ALL DOGS AND CATS BY A RETAIL PET STORE.

Item 4 – Staff Report

Item 4 – Exhibit A – Ordinance

Item 4 – Exhibit B – Federal and State Regulations

Item 4 – Exhibit C – Comparison of Local Ordinances
