For those of us old-timers who muscled 9-track tapes onto 10-foot-tall tape drives on Burroughs B-3500 mainframe computers, with each tape holding a total of about 5 kilobytes, the idea of sticking a 64 gigabyte SD memory chip into a laptop computer is pretty cosmic.
Terms like PCAM (punch card adding machines) are no longer part of the taxonomy of information technology, nor would any young person in the industry comprehend the idea of a disk platter or disk pack.
Skipping a bit ahead, we find a time when you could purchase an IBM “XT” computer with an integrated 10 megabyte hard drive. No more reliance on 5.25″ or later 3.5″ floppy disks. Hard drives have evolved to the point that Fry's will pitch you a USB or home-network 1 terabyte drive for about $100.
Enter the SSD
October 2009 brings us to the point where hard drives are becoming a compromise solution. The SSD (Solid State Disk) has jumped onto the data center stage. With MySpace’s announcement that it is replacing all 1,770 of its existing disk drive-based server systems with higher capacity SSDs, and its claim that SSDs use only 1% of the power required by disk drives, data center rules are set to change again.
SSDs are efficient. If you read press releases and marketing material supporting SSD sales you will hear numbers like:
- “…single-server performance levels with 1.5GB/sec. throughput and almost 200,000 IOPS…”
- “…a 320GB ioDrive can fill a 10Gbit/sec. Ethernet pipe…”
- “…four ioDrive Duos in a single server can scale linearly, which provides up to 6GB/sec. of read bandwidth and more than 500,000 read IOPS…” (Fusion-io)
This means that not only are you saving power per server, you are also able to pack a multiple of your existing storage capacity into the same space as with traditional disk systems. As clusters of SSDs become possible through further development of parallel systems, we need to get our heads around the concept of a three-dimensional storage system, rather than the linear systems used today.
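As a quick sanity check on the vendor claims above, the quoted 1.5GB/sec of single-device throughput really is enough to saturate a 10Gbit/sec Ethernet link. A minimal sketch of the arithmetic (the unit conventions are my assumption):

```python
# Can an SSD quoted at 1.5 GB/sec fill a 10 Gbit/sec Ethernet pipe?
GBIT_PER_SEC = 1e9 / 8                       # one gigabit/sec expressed in bytes/sec

ethernet_bytes_per_sec = 10 * GBIT_PER_SEC   # 10GbE carries 1.25e9 bytes/sec
ssd_bytes_per_sec = 1.5e9                    # vendor throughput claim

# 1.5 GB/sec comfortably exceeds the 1.25 GB/sec the pipe can carry
print(ssd_bytes_per_sec >= ethernet_bytes_per_sec)  # True
```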
The concept of RAID and tape backup systems may also become obsolete, as SSDs hold their images when primary power is removed.
Now companies like MySpace will be in a really great position to re-negotiate their data center and colocation deals, as their actual energy and space requirements will potentially be a fraction of existing installations. Even considering their growth potential, the reduction in actual power and space will no doubt give them more leverage to use in the data center agreements.
Why? Data center operators are now planning their unit costs and revenues based on power sales and consumption. If a company like MySpace is able to reduce their power draw by 30% or more, this represents a potentially huge opportunity cost to the data center in space and power sales. Advantage goes to the tenant.
The Economics of SSDs
Today, the cost of SSDs is higher than traditional disk systems, even with Fibre Channel or InfiniBand supporting large disk (SAN or NAS) installations. According to Yahoo Tech, the cost of an SSD is about 4 times that of a traditional disk. However, they also indicate that the cost is dropping quickly, and we will probably see near parity within the next 3~4 years.
Now, recall MySpace's claim that with the SSD migration they will consume only 1% of the power used by traditional disk (that is only the disk, not the entire chassis or server enclosure). If you look through a great white paper (actually called a “Green Paper”) provided by Fusion-io, you will see that implementing their SSD systems in a large disk farm of 250 servers (components include main memory, 4x net cache, 4x tier 1/2/3 storage, and tape storage) reduces site power from 146.6kW to 32kW.
Data centers can charge anywhere from $120~$225/kW, so, if you believe the marketing material, we could potentially see savings of roughly $20,000/month at $180/kW. This would also represent 47 tons of carbon per month, using the Carbon Footprint Calculator.
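Working the Fusion-io figures through (a rough sketch; the $180/kW monthly rate comes from the range above, while the 0.56 kg CO2 per kWh grid emission factor is my assumption, chosen as a plausible U.S. average, not a figure from the Green Paper):

```python
# Monthly savings implied by the 146.6 kW -> 32 kW reduction cited above.
before_kw, after_kw = 146.6, 32.0
rate_per_kw_month = 180.0       # $/kW per month, mid-range data center pricing
co2_kg_per_kwh = 0.56           # assumed grid emission factor (see lead-in)
hours_per_month = 730

saved_kw = before_kw - after_kw                         # 114.6 kW
dollars_saved = saved_kw * rate_per_kw_month
tons_co2_saved = saved_kw * hours_per_month * co2_kg_per_kwh / 1000

print(round(dollars_saved))   # 20628 -- roughly the $20,000/month above
print(round(tons_co2_saved))  # 47 tons of carbon per month
```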
Fusion-io reminds us that:
“In 2006, U.S. data centers consumed an estimated 61 billion kilowatt-hours (kWh) of energy, which accounted for about 1.5% of the total electricity consumed in the U.S. that year, up from 1.2% in 2005. The total cost of that energy consumption was $4.5 billion, which is more than the electricity consumed by all color televisions in the country and is equivalent to the electricity consumption of about 5.8 million average U.S. households.
• Data centers’ cooling infrastructure accounts for about half of that electricity consumption.
• If current trends continue, by 2011, data centers will consume 100 billion kWh of energy, at a total annual cost of $7.4 billion and would necessitate the construction of 10 additional power plants. (from “Taming the Power Hungry Data Center”)”
When we consider the potential impact of data center consolidation through use of virtualization and cloud computing, and the rapid advancements of SSD technologies and capacities, we may be able to make a huge positive impact by reducing the load Internet, entertainment, content delivery, and enterprise systems will have on our use of electricity – and subsequent impact on the environment.
Of course we need to keep our eyes on the byproducts of technology (e-Waste), and ensure making improvements in one area does not create a nightmare in another part of our environment.
Some Additional Resources
StorageSearch.Com has a great listing of current announcements and articles both following and describing the language of the SSD technology and industry. There is still a fair amount of discussion on the quality and future direction of SSDs, however the future does look very exciting and positive.
For those of us who can still read the Hollerith coding on punch cards, the idea of >1.25TB on an SSD is abstract. But abstract in a fun, exciting way.
How do you feel about the demise of disk? Too soon to consider? Ready to install?
John Savageau, Long Beach
“If everyone purchasing a room air conditioner in 2009 chooses an ENERGY STAR qualified model, it would save 390 million kilowatt-hours of electricity a year. That would prevent more than 600 million pounds of greenhouse gas emissions each year—equivalent to taking more than 50,000 cars off the road—and save consumers over $43 million each year in energy bills.” (Pickens Plan Fact of the Day, 8 Oct 09)
California has always prided itself on being a leader in alternative energy innovation. Driving through the hills around Livermore, Palm Springs, or between San Diego and Yuma brings skylines full of wind turbines. The California Energy Commission claims that wind turbines generated 6,802 gigawatt-hours of electricity, about 2.3 percent of the state’s gross system power. By the end of 2009 California actually expects to hit nearly 5% energy production from renewable sources.
While the wind turbine program has slowed a bit due to animal rights groups objecting to bird casualties from propeller strikes, California has not slowed at all in the state’s aggressive goals for green energy production. Though the timeline is probably a bit too aggressive, California’s Energy Commission has set a goal of hitting 20% by the end of 2010 (Senate Bill 107), and 33% by the end of 2020 (Executive Order S-14-08).
The US Congress is shooting for 20% renewable energy production nationwide by 2020 – a far less aggressive target than California’s.
Energy Programs and Incentives in California
Each state has some level of renewable energy initiative supporting energy efficient homes. California’s program falls under the California Energy Commission’s “New Solar Homes Partnership” (NSHP). This is a great resource not only for existing home owners in California, but also for those planning to build new structures. The objectives of the NSHP program include:
In the Home
A solar home with high energy-efficiency features offers homeowners:
- Clean, renewable energy
- Utility bill savings
- Predictable utility costs
- Protection against future rising electricity costs
California is also offering financial incentives to homebuilders to design energy efficiency and the potential of renewable energy planning into the new home. Solar energy “is one of the most significant personal actions one can take to cut air pollution and greenhouse gas emissions, while helping to conserve precious energy resources for future generations. Plus, it reduces the need for costly new power plants” according to the NSHP.
All California homeowners installing solar panels on their homes also qualify for federal tax credits of up to $2,000.
An unscientific jog around the Sunset Canyon Drive area of Burbank on 17 Oct 2009 tallied around 1 of every 5 homes supporting some level of solar panel on the property, visible from street level. Using guidelines from the National Renewable Energy Laboratory (NREL), the average family in the Los Angeles area will save nearly $714 a year with solar panels supplementing their electrical supply.
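The $714 figure can be roughly reconstructed as follows. This is a hypothetical sketch: the system size, sun hours, derate factor, and electricity rate are my own plausible-for-Los-Angeles assumptions, not NREL’s actual inputs:

```python
# Rough reconstruction of an annual rooftop-solar savings estimate.
system_kw = 2.0           # assumed modest residential array
sun_hours_per_day = 5.5   # assumed average insolation for Los Angeles
derate = 0.9              # assumed losses (inverter, wiring, soiling)
rate_per_kwh = 0.197      # assumed residential electricity rate

annual_kwh = system_kw * sun_hours_per_day * 365 * derate
annual_savings = annual_kwh * rate_per_kwh
print(round(annual_kwh), round(annual_savings))  # ~3614 kWh, ~$712 per year
```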
For us apartment and condo dwellers, that could almost pay 100% of our energy requirement during normal conditions, if we have a means of storing energy during evening hours and periods of bad weather.
Don’t forget our earlier discussions on other simple things such as painting your rooftop white, or using solar reflective material on your roof to reduce the amount of heat in your home during the summer. By the way… you also get a one-time energy credit for that simple task.
“More than 50% of the energy used in a typical American home is for space heating and cooling. Much of that conditioned air escapes through poorly sealed, under-insulated attics. Only 20% of homes built before 1980 are well insulated. Properly sealing and insulating your attic can save you up to 10% annually on energy bills.” (Pickens Plan Fact of the Day, 7 Oct 2009)
In Commercial Sites
Companies such as Bank of America (in Riverside, California) have built their facilities with solar panels covering the entire rooftop of the building. Not only do they enjoy tremendous savings in energy costs, but with a commercial property BoA will receive a 30% federal construction tax credit, accelerated equipment depreciation, and additional financing support to help defray the cost of installing renewable energy resources.
California will tack on an additional incentive of $1.90 per watt, up to a 1 megawatt solar panel system.
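The incentive math is straightforward. A minimal sketch using the $1.90/watt rate and 1 megawatt cap above (the example system sizes are hypothetical):

```python
def california_incentive(system_watts, rate_per_watt=1.90, cap_watts=1_000_000):
    """Incentive paid on installed capacity, capped at a 1 MW system."""
    return min(system_watts, cap_watts) * rate_per_watt

print(round(california_incentive(5_000)))      # 9500 -- a typical home array
print(round(california_incentive(1_500_000)))  # 1900000 -- capped at 1 MW
```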
All focused on getting us to that 20% milestone in 2010, and the world-leading 33% renewable energy target for 2020.
Some Resources to Look at During Energy Awareness Month (October)
The State of California, California’s energy utilities, and the US Department of Energy have great resources to guide us in meeting our energy awareness and energy planning goals. Here is a partial list, but a great start. The Internet and Google searches will help lead you further.
- The California Energy Commission Home
- California Renewable Energy Handbook
- Go Solar California Home
- California Solar Initiative
- SoCal Edison solar initiative website
- PG&E solar initiative website
- State of California CSI rebate calculator
- US Department of Energy Solar Initiatives
- The Pickens Plan
What Are You Doing?
Share your energy stories with us. What has worked for you? What has failed? Are you an alternative and renewable energy skeptic like Texas’ Governor Rick Perry? Are you an energy leader? Let us know.
John Savageau, Long Beach
Data Center “X” just announced a 2 megawatt expansion to their facility in Northern California. A major increase in data center capacity, and a source of great joy for the company. And the source of potentially 714 additional tons of carbon introduced into the environment each month.
Many groups and organizations are gathering to address the need to bring our data centers under control. Some are focused on providing marketing value for their members; most others appear genuinely concerned with the amount of power being consumed within data centers, the amount of carbon being produced by data centers, and the potential for using alternative or clean energy initiatives within data centers. There are stories around claiming the data center industry is actually using up to 5% of the power consumed within the United States, which, if true, makes this a really important discussion.
If you do a Bing search on the topic of “green data center,” you will find around 144 million results – three times as many as a “Paris Hilton” search. That makes it a fairly saturated topic, indicating a heck of a lot of interest. The first page of the Bing search gives you a mixture of commercial companies, blogs, and ezines covering the topic – as well as an organization or two. Some highlights include:
- 42U (Private Company Pitching Green)
- Green Data Center Blog
- Web Host Industry Review – Green Data Center Info (ezine)
- http://www.greenhousedata.com/ (private advocacy group)
- http://www.thegreengrid.org/ (industry organization)
With this level of interest you might expect just about everybody in the data center industry to be aggressively implementing “green data center best practices.” Well, not really. In the past month the author (me!) toured no fewer than six commercial data centers. In every data center I saw major best-practice violations, including:
- Large gaps within cabinets forcing hot air recirculation (not using blanking panels, as well as loose PCs and tower servers placed ad hoc on a cabinet shelf)
- Failure to use Hot/Cold aisle separation
- High density cabinets using open 4 post racks
- Spacing in high density server areas between cabinets
- Failure to use any level of hot or cold air containment in high density data center spaces, including those with raised floors and drop-ceilings which would support hot air plenums
And other more complicated issues, such as not integrating the electrical and environmental data into a building management system.
The Result of Poor Data Center Management
The Green Grid developed a metric called Power Usage Effectiveness (PUE) to measure the effectiveness of power usage within a data center. The equation is very simple: PUE is the total facility power consumption divided by the amount of power actually consumed by internal IT equipment (or, in the case of a public data center, customer-facing or revenue-producing energy consumed). A factor of 2.0 would indicate that for every watt consumed by IT equipment, another watt is required by support equipment (such as air conditioning, lighting, or other).
Most data centers today consider a target value of 1.5 good, with some companies such as Google trying to drive their PUE below 1.2 – an industry benchmark.
Other data centers are not even at the point where they can collect meaningful PUE data. The previous Google link has an extended description of data collection methodology, which is a great introduction to the concept. The Uptime Institute of course has a large amount of support materials. And a handy Bing search reveals another 995,000 results on the topic. There is no reason why any data center operator should be in the dark or uninformed on the topic.
So let’s use a simple PUE example and carbon calculation to determine the effect of a poor PUE:
Let’s start with a 4 MW data center. Of the 4 MW of power consumed within the data center, 3 MW goes to support equipment and 1 MW to actual IT equipment – a PUE of 4.0. In California, using the carbon calculator, this would return 357 tons of carbon produced by the IT equipment and 1071 tons of carbon produced by support equipment such as air conditioning, lights, poorly maintained electrical equipment, etc.
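The arithmetic behind that example can be sketched as follows. Note that a 1 MW IT / 3 MW support split in a 4 MW site corresponds to a PUE of 4.0; the 0.489 kg CO2 per kWh emission factor is my assumption, chosen to reproduce the 357-ton figure, not a number taken from the calculator itself:

```python
# PUE and monthly carbon for the 4 MW example above.
total_mw, it_mw = 4.0, 1.0
support_mw = total_mw - it_mw      # cooling, lighting, distribution losses

pue = total_mw / it_mw             # 4.0: every IT watt costs three more in overhead
hours_per_month = 730
co2_kg_per_kwh = 0.489             # assumed grid emission factor (see lead-in)

def monthly_tons(mw):
    # MW -> kWh for a month -> kg of CO2 -> metric tons
    return mw * 1000 * hours_per_month * co2_kg_per_kwh / 1000

print(pue)                              # 4.0
print(round(monthly_tons(it_mw)))       # 357 tons from the IT load
print(round(monthly_tons(support_mw)))  # 1071 tons from support overhead
```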
1071 tons of carbon each month, possibly generated by waste which could be controlled through better design, management, and operations in our data centers. Most commercial data centers are in the 4~10MW range. Scary.
The US Department of Energy recently did an audit entitled “Department of Energy Efforts to Manage Information technology in an Energy-Efficient and Environmentally Responsible Manner,” which highlights the fact even tightly regulated agencies within the US Government have ample room for improvement.
“We concluded that Headquarters programs offices (which are part of the Department of Energy’s Common Operating Environment) as well as field sites had not developed and/or implemented policies and procedures necessary to ensure that information technology equipment and supporting infrastructure was operated in an energy-efficient manner and in a way that minimized impact on the environment.” (OAS-RA-09-03)
What Can We Do?
The easiest thing to do is quickly replace all traditional lighting with low-power-draw LED lamps, and only use the lamps when human beings are actually working within the data center space. Lights generate a tremendous amount of heat and consume a tremendous amount of electricity – and heat means air-conditioning load, if that wasn’t already obvious. Completely wasted power, and completely unnecessary production of carbon. If you are in a 10,000 sq ft data center, you may have 100 lighting fixtures in the room. Turn them off.
If your data center requires security cameras 24×7, consider using dual-mode cameras that have low light vision capability.
Place blanking panels in all cabinets. Consider removing all open racks from your data center unless you are using them for passive cabling, cross-connects, or very low power equipment. Consider using hot or cold aisle containment models for your cabinet lineups. There is plenty of debate on the merits of hot aisle containment vs. cold aisle containment, but the bottom line is that cool air going into a server makes the server run better, reduces the electrical draw of fans, and increases the value of every watt applied to your data center.
Consider this – if you have 10 servers sharing 1,920 watts (120V on a 20 amp breaker, derated to 16 amps of continuous draw), you have the potential of running those 10 servers at full specification draw, including the internal fans that start as needed to keep internal components within equipment thresholds. If a server is running hot, you are using your full 192 watts per server. If the server is running with cool air on the intake side, with no hot air recirculation heating the circuit boards, then you can reasonably expect to reduce the electrical draw of that component.
If you are able to reduce the actual draw each server consumes by 30~40% by eliminating hot air recirculation and keeping the supply side cool, then you may be able to add servers to the cabinet and increase your potential processing capacity per breaker and cabinet by another 30~40%. This will increase your efficiency, cost you less in electricity, and give you additional processing potential.
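The cabinet arithmetic above can be sketched as follows (the 30% draw reduction is taken from the range in the text; treating the per-server draw as a fixed worst-case number is my simplifying assumption):

```python
# Servers per 120V/20A circuit, derated to 16A of continuous draw (80% rule).
volts, continuous_amps = 120, 16
budget_watts = volts * continuous_amps       # 1920 W usable per breaker

hot_server_watts = 192.0                     # 1920 W / 10 servers, as in the example
cool_server_watts = hot_server_watts * 0.70  # assumed 30% reduction with cool intake air

servers_hot = int(budget_watts // hot_server_watts)    # 10 servers
servers_cool = int(budget_watts // cool_server_watts)  # 14 servers, +40% capacity
print(servers_hot, servers_cool)  # 10 14
```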
Sources of Information
Quite a few sources of information, beyond the Bing search, are available to help IT managers and data center managers. APC probably has the most comprehensive library of white papers supporting the data center discussion (although, like all commercial vendors, you will see a few references to their own hardware and solutions). HP also has several great, easy-to-understand white papers, including one of the best reviewed, entitled “Optimizing facility operation in high density data center environments,” a step-by-step guide to deploying an efficient data center.
The Bing search will give you more data than you will ever be able to absorb; the good news is that it is a great way to read through individual experiences, including both success stories and horror stories. Learn through others’ experiences, and start on the road to both reducing your carbon footprint and getting the most out of your data center or data center installation.
Give us your opinions and experiences designing and implementing the green data center – leave a comment and let others learn from you too!
John Savageau, Long Beach
Rob Bernard knows green. As the Chief Environmental Strategist at Microsoft he walks the talk of reducing our carbon footprint, and evangelizing the impact of our actions on both the environment and quality of life. Our quality of life, and the quality of life others on the planet wish to enjoy.
At Microsoft we are committed to software and technology innovations that help people and organizations around the world improve the environment. Our goal is to reduce the impact of our operations and products, and to be a leader in environmental responsibility.
(From Rob Bernard’s presentation at Data Center Dynamics, 6 Aug 09, Bellevue, WA)
Rob told the story of his first week at Microsoft. In their Redmond campus, Microsoft provided logo Styrofoam coffee cups to both visitors and employees. Lots of cups. Almost two million Styrofoam cups a year ended up in the trash.
Immediately prior to joining Microsoft, Rob had taken his family on a short trip to Oregon, where he stopped for a coffee break and noticed the barista handed him a paper coffee cup printed with a notice that the cup would biodegrade within 30 days of use. Dust to dust. And about the same price as the Styrofoam cups – those environmentally unfriendly, landfill-bound, non-biodegradable Styrofoam coffee cups.
Needless to say, Microsoft is now using biodegradable coffee cups, made from recycled paper stock.
The Green Telephone
Rob gave another example of simple things we can do. Oddly, he was not evangelizing Microsoft products, but rather talking to us as one planet resident to another planet resident. He gave the example of telephones, computers, and video. Most of us have a telephone plugged into the wall at home, next to a desktop or personal computer, in the same room as a television set.
Rob simply explained that he has now unplugged the telephone and television, and uses all three services off a lower power draw “Energy Star” computer. No more need to burn electricity to power redundant utilities within the house.
Microsoft Carbon Production
Microsoft is not perfect. In fact, Rob noted that as a company they produced more than 936,000 tons of carbon in 2008 – a grossly unsatisfactory condition for a company such as Microsoft, which employs some of the greatest minds in the world. Now Microsoft is on a corporate search-and-destroy mission to eliminate waste – not only internally, but also by sharing with everybody the lessons learned in the quest to reduce their negative impact on the planet. Kind of “open source” green.
Part of the philosophy is to lead by example within the Microsoft campus, and to stress to employees that everything learned on campus may be transferable to their personal lives and homes.
Act with Transparency, Let Employees Inspire
- Understand your impact
- Share and borrow best practice
- Employees lead by example
- Compostable dishware
- Connector Bus (employee transportation from home or “park and rides”)
- Kitchen grease (contributed to biodiesel)
- Help individuals drive change
- Support employee engagement
- Measure, measure, measure
(From Rob Bernard’s presentation at Data Center Dynamics, 6 Aug 09, Bellevue, WA)
He actually believes, evangelizes, and strongly urges Microsoft employees to live the talk.
The Data Center Challenge
Rob advised the delegates the US Government is preparing to study the potential of taxing data center operators who consume too much utility power. He went on to urge data center operators to aggressively attack the existing inefficiencies of data center designs, and start a structured approach to rethink, rebuild, and redesign our approach to data centers.
Do you use blanking panels to reduce inter-cabinet hot air recirculation? Are you working on consolidating individual applications into server-based applications? Do you really understand the implications of running high powered computer and server systems which only use 5~10% of their CPU and disk capacity? When possible, do you insist on buying and deploying “Energy Star” equipment, for, well,… for everything?
One problem many data center operators have is that they really don’t know how much energy their data center – much less individual components of the data center – is actually using. As much as we’ve seen it in the news, most data center operators have not even attempted to calculate their Power Usage Effectiveness (PUE) rating. That is the equation comparing the total power the facility consumes, including support services, against the power actually applied to IT equipment and operations.
Bottom line is how do we fix problems when we have not even audited our equipment and power consumption? We’ve got to get smart. A data center drawing 10 megawatts of power is producing a creepy amount of carbon, so we better start taking it seriously. And oh yeah, the government is going to eventually regulate our industry (since some data centers consume nearly as much energy as the city of Fresno), and penalize those data center operators who cannot prove their efficient use of power.
Consider the Cloud
It is here. It is working. It helps consolidate inefficient data centers into efficient data centers, eliminating much of the unused processing and storage capacity we insist on burning our limited CAPEX to fund. If we can bring our cloud utilization up to 80% through virtualization of existing stand alone server systems, well – we will recover considerable operational and capital expenses by eliminating hardware that consumes electricity, space, and costs a lot to purchase.
Smaller companies can gain even larger benefits by outsourcing their processing and storage to commercial cloud Infrastructure and Software as a Service (IaaS/SaaS) providers, eliminating their need to operate a data center – period.
Rob Sells the Audience
Throughout his presentation the audience remained silent, fixed on his words. It is easy to listen to a man who not only knows his material cold, but also projects an enthusiasm which reaches into the soul of everybody present, including those who were squeezing into the back of the room to hear more of his ideas, stories, and visions of a greener future.
I am sold, as were a couple hundred other conference participants. I want to be green, and will not only try to bring more conscientious effort to my personal life, but also become a micro-evangelist in my company. And a macro-evangelist to my industry.
Rob’s website is http://www.microsoft.com/environment
John Savageau, Long Beach
Green technology and green living are nearly as popular in the world of buzz words as solving global hunger and “i-Everything.” Some cities take the topic more seriously. Looking at the Long Beach Press-Telegram on July 2nd, 2009, of four headlines, three dealt with green projects and green initiatives within the city and Port of Long Beach.
Long Beach is an important city, not only to the Los Angeles area, but also to the entire United States. Together with the adjacent Port of Los Angeles, the Port of Long Beach forms the largest port complex in North America, and one of the largest in the world. In addition, Long Beach sits on top of the Wilmington Oil Field, producing more than 15 million barrels of crude oil each year.
And yet, those of us who live in Long Beach find it one of the most exciting cities in the area, if not the country. Why? In addition to the urban renewal programs, Long Beach is a leader in green technologies and policies ranging from setting new global standards in the port, to world-renowned desalination projects, to innovations in the Wilmington Oil Field that make an often maligned industry a source of pride for the city.
Cleaning the Ports
While the Port facilities may account for a large percentage of the pollutants covering the greater LA Basin, Long Beach is taking creative and positive steps to reduce the impact of container ships and diesel trucks on the environment.
Construction of the Alameda Corridor, a 20 mile, largely underground train line connecting the port facilities to cargo distribution facilities in downtown Los Angeles (the Intermodal Container Transfer Facility), took hundreds of diesel trucks off the road and further increased the efficiency of loading and unloading cargo at the port. From the Los Angeles distribution center, cargo is sent on to destinations throughout the United States and Canada using much cleaner rail systems. Current development projects are focused on replacing existing diesel locomotives with electric trains, further reducing the impact of container transfer in and out of the port.
Another recent innovation within the Port of Long Beach is the new “Dockside Power System,” which allows ships visiting the port to plug into electrical systems provided at dockside power stations, allowing container ships to use electrical auxiliary power rather than continuing to burn diesel while docked. The LA Times reports that “emissions reductions amount to 50%, even when factoring in pollution created by power plants in generating the electricity.”
Water Desalination Projects
The Long Beach desalination project started in 1996 with a federal grant authorizing funding for construction of a pilot plant pumping 9,000 gallons per day. With innovations patented by the Long Beach Water Department (invented by Diem Vuong, Assistant General Manager at the LBWD) and called the “Long Beach Method,” the city has perfected a two-stage nano-filtration process which reduces the amount of energy required to desalinate water by up to 30%.
Long Beach’s current desalination project called the “Long Beach Seawater Desalination Research & Development Facility” is the largest project of its kind in the United States, producing more than 300,000 gallons per day. The output from this facility will reduce the city of Long Beach’s need for Colorado River water by more than 15%.
To be honest, there are many other desalination projects in cities lining the coast, including Huntington Beach, San Diego, Oxnard, and others. However Long Beach has provided, and will continue to provide, strong leadership in global desalination initiatives.
Of the “green” headlines in the LB Press-Telegram mentioned above, one story does discuss a grant of nearly $3,000,000 in stimulus funds to further develop Long Beach’s desalination technology and innovations.
Another great project within the city is the use of reclaimed water. In 2008 Long Beach provided more than 1.5 billion gallons of reclaimed or non-potable water to various users, for landscaping as well as “grey” water usage in air conditioning systems within commercial properties.
The reclaimed water project is using water from many different sources, including water runoff from storms, barrier water from the Los Angeles River, and other waste water which would normally run into the Pacific Ocean.
The Wilmington Oil Field
There is no real way to have a completely clean environment when you are dealing with oil. Not only the waste surrounding drilling and pumping oil, but also the process of refining oil creates a tremendous environmental mess. The area starting in Long Beach and passing through Carson to El Segundo supplies much of the refined oil used by California drivers, and drivers in surrounding states.
Until we, as a culture, further embrace transportation which does not require the use of fossil fuels, we will not have a clean environment. The one concession Long Beach has made to reducing the negative impact of pumping oil from the rich Wilmington Oil Field, which underlies much of the area from San Pedro to Seal Beach, is some beautification of the oil pumping islands in San Pedro Bay.
The photo at the top of this page shows Island Grissom near the Shoreline Drive area of Long Beach, in sharp contrast to the historic photo immediately above showing the old Long Beach and Signal Hill of the past, when the priority was drilling at any cost. The Honolulu Advertiser has described the efforts to give the oil field a more pleasing façade as a “prime example of the aesthetic mitigation of technology.”
The four offshore islands, operated by THUMS (originally named for a consortium consisting of Texaco, Humble, Union, Mobil, and Shell – T.H.U.M.S.), are named after astronauts who died in NASA accidents (Freeman, Chaffee, White, and Grissom), and are about 10 acres apiece. From these island drilling platforms, constructed in the 1960s and 70s, more than 1,200 oil wells have been drilled. (AAPG Explorer)
To its credit, THUMS has never recorded a major oil leak or spill, and claims that “in addition to investing millions of dollars to install pollution limiting equipment throughout our operations, we are working to help meet California’s energy needs by developing a long term supply of clean burning natural gas.” (Frank Komin, THUMS facility Manager)
The Reality and the Future
While the Long Beach Port area continues to be a major source of pollutants in the LA Basin, the city and community continue to push Long Beach to be a leader in not only solving the local problems of a damaged environment, but also use creativity and technology to produce a better future for Long Beach and the world. Of particular note, the desalination projects within Long Beach are a major source of community pride.