Big data is big business.
But as more data becomes available to collect, analyze, interpret and ultimately capitalize on, one big challenge remains: the banks of servers, and the security and cooling systems, needed to keep those data storage centers up and running.
While some big tech companies have turned to icy, far-flung locations like Sweden and Norway and even underground tunnels for data storage, one Pleasanton-based startup is turning to U.S. ports and waterways for a new storage solution.
Nautilus Data Technologies has built a working prototype vessel to serve as a floating, water-cooled data center.
“I was working with Microsoft in the ’90s and saw how data centers were getting bigger and more critical, and knew there was a better way to construct them,” said founder and Chief Technology Officer Arnold Magcale, who coupled his data security and U.S. Navy experience to launch the business.
“The problems we see with data centers are only getting bigger — from space to environmental impact to price and security — I knew we were on the wrong path and had to do something,” Magcale said.
The data center company says it is essentially using Navy technology that is more than 70 years old. Nautilus now has a working prototype vessel that it says can be tied to any secure port or waterway in the U.S. and can be built in one-third the time and hold the same data as land structures three times the size.
And by using water, rather than chemical coolants, as a cooling method, Magcale hopes more companies will see that their data doesn’t need to negatively impact the environment.
The startup has completed its proof of concept and is talking to potential customers that either want their own custom vessels or want to share space on communal vessels. With a few anchor tenants, Nautilus is hoping to get its first customer vessels up and running in the latter part of 2016.
The vessels aren’t floating out at sea — they’re designed to be set at ports and in areas overseen by national security, and Nautilus is hiring U.S. veterans trained in securing sensitive material and data for the added security most clients are seeking.
“We know that the time it takes to get data centers up and running is often two or three years,” said CEO Jim Connaughton. “So we are hoping our six- to nine-month timetable will bring in the clients while the energy, water and environmental benefits keep them excited in our new solution to a $175 billion-a-year problem of data center development and maintenance.”
Nautilus Data Technologies is making history. The California-based company recently became the first in the world to successfully launch a waterborne data center prototype, charting a new course for future data management. The Nautilus Data Technologies data center solution incorporates advanced maritime technology with patent-pending infrastructure systems and predictive software controls to create a revolutionary floating data center system, one that significantly lowers cost, increases rack density, improves energy efficiency, and eliminates water consumption, while also providing rapid scalability and global mobility.
Nautilus’ approach reflects the experience and vision of founder Arnold Magcale, a U.S. Navy Special Forces veteran and early pioneer of Silicon Valley data centers. Magcale combined military security knowledge and maritime electronics expertise with decades of data center technology leadership to reimagine data center infrastructure from the ground up, creating the most significant data center industry advancement in decades.
“With the explosion of big data, it became increasingly clear that the traditional data center model was unsustainable— an outdated archetype that wastes massive amounts of energy and water at an unacceptable rate,” said Arnold Magcale, Nautilus Data Technologies Founder and CTO. “Nautilus Data Technologies is transforming the data center industry— providing a waterborne solution that dramatically reduces energy use and carbon emissions while eliminating all water consumption. It’s a model of efficiency—the model for the future.”
Nautilus Data Technologies is currently constructing its first commercial production facility at a Northern California Naval Shipyard. The commercial vessel is based on the company’s original proof-of-concept prototype, successfully validated in 2015 in association with Applied Materials and the U.S. Navy. The facility is being built on a military-grade, Coast Guard approved barge, and will be securely docked at a well-protected Northern California port.
Nautilus Data Technologies has a dozen pending patents, with its cooling technology among the more striking innovations. The technology leverages the natural body of water below the barge to cool the facility, safely re-circulating all of the water back where it originated. No chemicals are used during the process, ensuring the system provides a sustainable and environmentally friendly alternative to traditional land-based facilities.
The modular marine-based design is highly responsive to evolving market demands, and can easily be docked in emerging market locations wanting access to modern IT services—further helping to democratize information. The waterborne structure can also be built in half the time it takes to construct a land-based facility, allowing greater flexibility and rapid global deployment.
Nautilus Data Centers will provide hosting and colocation services in a multi-tenant environment with a standard configuration of up to 800 server racks, as well as custom-built data center vessels for dedicated customers and uses. The company’s advanced cooling systems will also be able to handle high-density and high performance computing requirements. Nautilus has broken the 30 kW per rack barrier and can scale environments up to 75 kW per rack, and potentially higher.
Nautilus’s Data Center Infrastructure Management (DCIM) solution employs artificial intelligence for automated, lights-out operation, further enhancing the productivity of operations and power management. The predictive DCIM will monitor operations to dynamically control the data center’s humidity, temperature, and rack power levels without any human intervention, and will provide customers detailed access to the operational status of their systems.
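The predictive, lights-out monitoring loop described above can be sketched in code. This is an illustrative toy only; Nautilus’s DCIM software is proprietary and unpublished, so every name and threshold below is a hypothetical stand-in (the 75 kW power limit echoes the per-rack figure the company cites).

```python
# Toy sketch of a lights-out DCIM evaluation step: take a sensor reading
# for a rack and return the automated actions an AI-driven controller
# might trigger, with no human in the loop.
from dataclasses import dataclass

@dataclass
class RackReading:
    rack_id: int
    temp_c: float       # inlet air temperature, degrees Celsius
    humidity_pct: float  # relative humidity
    power_kw: float      # rack power draw

def evaluate(reading, temp_limit_c=27.0,
             humidity_range=(40.0, 60.0), power_limit_kw=75.0):
    """Return automated actions for one rack based on threshold checks."""
    actions = []
    if reading.temp_c > temp_limit_c:
        actions.append(f"rack {reading.rack_id}: increase coolant flow")
    lo, hi = humidity_range
    if not lo <= reading.humidity_pct <= hi:
        actions.append(f"rack {reading.rack_id}: adjust humidification")
    if reading.power_kw > power_limit_kw:
        actions.append(f"rack {reading.rack_id}: alert - power above rated density")
    return actions
```

A real predictive system would forecast these readings rather than merely react to them, but the threshold-and-act structure is the same.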
The company’s first commercial vessel will be commissioned in the summer of 2016 and Nautilus is currently signing customers who want to be on board for the historic launch. Additional vessels are in the production pipeline for 2017 and beyond. At Nautilus Data Technologies, it’s full speed ahead.
By: Kim Brunhuber
Canadian Broadcasting Corporation
At first glance, the old barge docked at the Mare Island Naval Shipyard looks like it’s just taking a breather on its slow journey to the scrap yard.
Most of the browned, pitted metal panels that form the deck are slightly warped, creating hundreds of tiny reflective pools. A few have been removed, creating man-sized holes that nearly landed this unwary journalist in the bowels of the ship. Below deck, machines roar and hiss as workers scoop out the barge’s insides. Over the next couple of months, the vessel will be transformed into a state-of-the-art world-first. Other companies have tried to build their own versions of this ship, and until now, all have failed.
“You’re seeing the future,” says Kirk Horton. “You’re seeing the revolution.”
The vice-president of Nautilus Data Technologies leads me across the deck carefully, avoiding the puddles and holes. The age and condition of the barge is part of the plan; the company says it intends to only retrofit pre-owned vessels as part of its commitment to environmental sustainability. Horton says in about five months, this ship — certified as sea-worthy by the U.S. Coast Guard — will be fully operational. Already companies have bought space for their servers.
“This is the world’s first highly efficient, highly sustainable waterborne data centre,” says Horton.
Whether they’re the size of airplane hangars or tiny closets tucked away in the basement, data centres house rows and rows of disk arrays and routers — the building blocks of the internet — that store and transmit our data.
What makes this data centre so special isn’t just that it’s in the ocean, but the fact that it will be cooled by the very water upon which it floats.
“So this is our heat exchange,” says Arnold Magcale, the company’s CEO. We step into a small shack next to the barge which houses a miniature version of what will be installed on the barge. It was used to prove that his concept actually works.
He points to the pipes that run behind the server racks. The water in the pipes absorbs heat, then is expelled back into the ocean while cool water is drawn in. It’s a virtuous circle, he says, one that has passed every environmental assessment so far.
“What we’re doing here is moving water versus moving air, which is five times more efficient,” says Magcale.
It can save companies as much as 40 per cent on their energy bill, adds Horton.
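Some quick arithmetic with standard textbook properties of water and air shows why water is such an effective heat-transfer medium. (The "five times more efficient" figure Magcale quotes refers to the energy cost of moving the coolant through the system, not to heat capacity alone, which favors water far more dramatically.)

```python
# Back-of-the-envelope comparison of water vs. air as a coolant.
# Property values are standard figures near room temperature.

water_density = 998.0   # kg/m^3
water_cp = 4186.0       # J/(kg*K), specific heat of liquid water
air_density = 1.2       # kg/m^3
air_cp = 1005.0         # J/(kg*K), specific heat of air

# Heat carried per cubic metre of coolant per degree of temperature rise:
water_vol_capacity = water_density * water_cp   # ~4.2 MJ/(m^3*K)
air_vol_capacity = air_density * air_cp         # ~1.2 kJ/(m^3*K)

ratio = water_vol_capacity / air_vol_capacity
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume than air")
```

In other words, a given volume of water absorbs thousands of times more heat than the same volume of air, which is why pumping water beats blowing air even after accounting for pump energy.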
“With the advent of big data, as cloud technology further progresses, you’re going to see more and more advanced IT technology — the server infrastructure, the equipment, the storage devices — they will continue to draw more and more power,” says Horton.
The Nautilus barge — located about 40 kilometres northeast of San Francisco — is an attempt to solve a problem most people didn’t even know existed.
Every time you update your Facebook profile, every time you email a friend, every time you stream your favourite show, somewhere in a dark room in a building far away, lights flicker, servers whir and air conditioners roar. Every year, we use more data. Every year, the number of data centres grows. And every year, those data centres use more electricity.
“Data centres are the new modern-day factories,” says Mukesh Khattar, technical executive with the Electric Power Research Institute, an organization funded by the electrical utility industry.
In 2000, before the prevalence of streaming companies like Netflix, data centres accounted for one per cent of U.S. power consumption, he says. By 2015, that number had tripled.
“That number’s increasing continuously,” says Khattar. “And you can see that. Everybody has a cellphone these days, everybody has a portable device. All of these devices are connected in the back-end to a data centre.”
Inside the data centres, the servers generate so much heat that if they’re not kept cool, they melt.
“For every unit of energy that goes into powering IT in an average data centre, you need another unit of energy to cool the data centre down,” says Pierre Delforge, director of high-tech sector efficiency at the Natural Resources Defense Council, a non-profit environmental advocacy group.
If you think of each data centre as a plane taking off, Delforge says, only about 10 per cent of the seats — the servers — are used. That’s because data centres are designed to handle “peak load,” which is the maximum amount of traffic they’re expected to experience, like a rush of customers on Cyber Monday.
“The problem is the other 364 days in the year, they’re still running all the servers,” says Delforge. “They’re not powering them down when they’re not needed.”
Out there, he says, is a robot army close to 15 million strong, waiting for orders that rarely come.
Khattar believes it’s because those who run corporate data centres aren’t responsible for how much energy their IT systems use; they’re judged on reliability and speed.
“Do you want to wait for a few seconds to get your picture downloaded? No!” says Khattar. “We want it instantaneously. And companies are just responding to that.”
You might think the villain in the black hat would be internet giants like Facebook. To handle the company’s one trillion page views each month, Facebook operates several server farms, some of which are about the size of six football fields. But big companies have big energy bills, so they have an incentive to cut the amount of power their data centres use.
“The estimate — I think it was about a year ago — was that we saved over $2 billion,” says Facebook’s director of sustainability Bill Weihl. “Which means it’s well worth investing the time and money.”
That investment led to the development of new, stripped-down, highly efficient servers that produce less heat. Weihl says three of Facebook’s data centres run entirely on clean energy. At the other end of the high-tech spectrum, the company uses something called “free cooling.” Basically … windows.
“We open up the window on one side and blow the hot air out. We open up the door on the other side and bring in cool air from outside,” says Weihl. “The amount of energy we’ve saved is the equivalent of the energy used in a year by about 78,000 U.S. homes, and avoided emissions [are] the same as taking about 95,000 cars off the road.”
Experts like Delforge say it may be counter-intuitive, but the Facebooks of the world aren’t really the problem.
“The cloud-computing companies like Facebook, Google, and others, they’re only responsible for collectively about five per cent of all data centre energy use,” Delforge says. “Individually they use a lot of energy, but there’s relatively few of them compared to all the small server closets and small rooms that you find in virtually every floor of every office building in the country.”
Those account for about half of the energy used by data centres, he says.
“This is a very old data centre,” says Khattar, taking me on a short tour of the classroom-sized data centre being phased out by his own organization.
To keep their servers cool, most companies with small data centres just blast the A/C. The people who run the majority of IT departments aren’t aware that the industry standard has changed. New data shows that the air used to cool servers can actually be about 11 degrees Celsius (20 F) warmer.
“The mechanical equipment — the hardware — doesn’t require you to be as cold as in the past,” Khattar says. “You can use much warmer air … and your system will work very efficiently under those conditions.”
And there’s more good news. While older data centres require as much energy to cool as they do to operate, new ones only need one-tenth of the energy.
“The newer ones being built by the large companies are already more efficient,” Khattar says. “There’s a big, big improvement happening in the infrastructure side.”
But Delforge is still skeptical.
“At the moment we’re seeing a few leaders in the high-tech industry and other sectors pioneer new technology that can significantly reduce data-centre energy, but we need more than just a few shining examples,” Delforge says. “We need the majority — and eventually all data-centre operators — to use these best practices.”
Even if they do, there’s another challenge: the Jevons paradox. Nineteenth-century economist William Stanley Jevons observed that when technology improves efficiency, consumption doesn’t go down, it goes up.
And that, Delforge fears, seems to be the case with data centres.
“Progress is being outpaced by the rapid growth of the industry,” he says.
By: Tina Rose
Microsoft Underwater Data Center
“The sea is everything” -Jules Verne
Microsoft, believing that the sea holds the key to its future, has tested a self-contained data center that operates far below the surface of the ocean. The key to this study is the millions of dollars it could save on the industry’s most expensive problem: air conditioning.
Thousands of computer servers generate a lot of heat, and the need to keep them running effectively and efficiently is the reason for considering water as a cooling medium. Too much heat causes servers to crash; running servers underwater could not only cool them but allow them to run even faster.
The effort, code-named Project Natick, might lead to giant steel tubes connected by fiber optic cables on the ocean floor. Another option would be to capture ocean currents with smaller, jellybean-shaped turbines that would generate the electricity needed for cooling.
With the exponential growth of technologies such as the Internet of Things, demand for centralized computing will only increase. Microsoft currently operates more than 100 data centers and is spending more than $15 billion to expand its global data systems.
While Microsoft is looking to underwater locations to meet its growing computing needs, other companies have found their own unusual locations and ways to build data centers, taking advantage of different resources.
The SuperNap Data Center, a $5 billion, 2 million square foot facility in Michigan, is located in the former Steelcase office building. Switch built the SuperNap Data Center in Grand Rapids within the seven-story pyramid-shaped building, which features a glass and granite exterior. It will be one of the largest data centers in the eastern U.S.
Nautilus Data Technologies has likewise turned to the sea, developing floating data centers. The company recently announced its first project, the Waterborne Data Center, and believes its approach to cooling will cut into the more than $13 billion Americans currently spend each year. According to Arnold Magcale, CEO and co-founder of Nautilus Data Technologies, “The Nautilus proof of concept prototype exceeded all expectations – validating how our waterborne approach will provide the most cost effective, energy efficient and environmentally sustainable data center on the market.”
At a more clandestine location that also incorporates water as a cooling mechanism, Academica designed a hidden underground data center that uses pumped seawater to cool the servers. As an added bonus, the heat generated by the cooling process warms more than 500 local homes before the water is returned to the sea.
“The sea is only the embodiment of a supernatural and wonderful existence.” -Jules Verne
By: Tom Coughlin
Data centers are a big consumer of electricity; even a moderately sized data center can consume as much energy as a small city. With the growing volume of digital data generated by consumers and businesses that must be stored in these facilities, and with the vast amounts of data that will be created by the Internet of Things and other emerging sources, data centers will become more numerous and consume more power.
There are a number of innovative approaches to the design of data centers, as well as to the way their services are sold, aimed at making offerings as efficient and cost-effective as possible. There are also new developments that could allow greater flexibility and wider use of these elements, which are increasingly critical to our digital civilization.
Aligned Data Centers is providing a “pay-for-use” data center in Plano, Texas. The 300,000 square foot, $300 million, 30 megawatt complex provides cutting-edge energy efficiency and plans to use a highly efficient operating system that is expected to deliver substantial savings in energy and water consumption.
Traditional co-location models lock customers into a long-term contract for data center electrical power that they may not use. This new model allows customers to pay for the electricity they actually use, eliminating the need to forecast IT demand and providing real-time control of required storage and processor capacity. As a result, tenants are able to reduce energy and resource waste (power, cooling and water) and thus lower their operating costs.
A San Francisco Bay Area data center company, Nautilus Data Technologies, is offering waterborne data centers that use the water surrounding the vessel to remove data center heat. Its Waterborne Data Center prototype should allow enterprises to dramatically reduce the costs of computing and storage while operating an environmentally sustainable data center.
A Tale of Two Data Centers
According to Nautilus, a single mid-size data center can consume 130 million gallons of water a year (enough to supply nearly 2,000 people). The Nautilus approach consumes no water: it puts disaster-resistant, marine-grade data centers on Coast Guard-certified barges in secure ports and uses the naturally cooled water around the barge to reduce the temperature in the facility, with no evaporation of water.
They report that the temperature of the returned cooling water increases by only 2-4 degrees Fahrenheit, posing minimal environmental impact to the surrounding water. The waterborne vessel can be moved as needed and works in salt, brackish or fresh water. The annual savings in electricity costs for this new data center model are estimated at more than $4 million per year. Carbon emission reductions are estimated to be more than 19,000 tons per year.
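Those figures allow a rough, back-of-the-envelope sizing of the cooling loop. The assumptions here are mine, not Nautilus’s: an 8 MW IT load (the capacity announced for the company’s first commercial vessel) rejected entirely to the water loop, at the midpoint of the reported 2-4 degree Fahrenheit rise.

```python
# Estimate the water flow rate needed to carry away the facility's heat,
# using Q = m_dot * cp * dT rearranged for mass flow.

it_load_w = 8_000_000               # assumed 8 MW facility heat load
delta_t_f = 3.0                     # midpoint of the reported 2-4 F rise
delta_t_k = delta_t_f * 5.0 / 9.0   # a Fahrenheit difference is 5/9 of a kelvin

water_cp = 4186.0                   # J/(kg*K), specific heat of water
water_density = 998.0               # kg/m^3

mass_flow = it_load_w / (water_cp * delta_t_k)  # kg/s of water required
vol_flow = mass_flow / water_density            # m^3/s

print(f"~{mass_flow:,.0f} kg/s (~{vol_flow:.2f} m^3/s) of water")
```

The result, on the order of a cubic metre of water per second, gives a sense of scale: a substantial but entirely practical flow for pumps drawing on an effectively unlimited body of water.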
By: News Staff
Data center infrastructure provider Nautilus Data Technologies announced Thursday it has named Kirk Horton as vice president of sales and marketing.
Horton previously was vice president of channel sales for IPR, a provider of cloud and co-location services. He was also vice president of enterprise sales and channels for Telx. Prior to Telx, Horton served in executive sales and channel roles at data center and technology companies Globix (now QTS Realty Trust), Digital Island and SpeedERA (Akamai).
Horton is a past member and advisory board member of the Technology Channel Association. He attended California Polytechnic State University, San Luis Obispo, and has a bachelor’s degree in international business administration/management from Notre Dame de Namur University.
The Pleasanton, Calif.-based Nautilus Data Technologies recently showed off its 30,000-square-foot Waterborne Data Center, built on a moored barge. The company has dubbed it the world’s first floating data center.
By: George Leopold
A startup called Nautilus Data Technologies said it is building the first commercial “waterborne” data center at a naval shipyard at Mare Island Naval Complex north of San Francisco. The company said deployment at the “secure port” is scheduled for next year.
Nautilus CEO Arnold Magcale is a former member of the U.S. Navy Special Forces. The startup said it worked with the Navy’s Space and Naval Warfare Systems Command and the Naval Postgraduate School to develop its data center prototype.
Data center operators have adopted a range of novel approaches to reducing energy consumption as a way to improve a key industry metric: power usage effectiveness, or PUE. (Data center operators aim for a PUE rating approaching 1; Nautilus said its prototype design has been validated at 1.045.)
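PUE is simply the ratio of total facility power to the power delivered to IT equipment, which makes the cited figures easy to compare. The 1,000 kW IT load below is a hypothetical example, not a Nautilus specification.

```python
# PUE (power usage effectiveness) = total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to IT, with zero cooling overhead.

def pue(total_facility_kw: float, it_kw: float) -> float:
    return total_facility_kw / it_kw

it_load_kw = 1000.0  # hypothetical IT load

# An average legacy facility needs roughly one watt of cooling per IT watt:
legacy = pue(2000.0, it_load_kw)     # PUE 2.0

# The Nautilus prototype's validated figure implies only 4.5% overhead:
nautilus = pue(1045.0, it_load_kw)   # PUE 1.045

print(f"legacy: {legacy}, nautilus: {nautilus}")
```

On this hypothetical load, the difference is 1,000 kW of cooling overhead versus 45 kW, which is where the claimed energy savings come from.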
As data center PUE ratings hit a wall, more radical designs are being considered and, significantly, attracting venture funding. Google initially proposed the concept of a floating data center. Nautilus, based in Pleasanton, Calif., said it is building the first commercial “data barge” based on a prototype moored off the California naval complex.
Even the company acknowledges early doubts about the inherent risks involved in floating millions of dollars worth of computing and other IT gear. Jay Kerley, chief information officer of Applied Materials, a major supplier to the semiconductor industry, admitted initial skepticism before becoming an advisory board member. Scott McNealy, co-founder of Sun Microsystems, also advises the startup.
Applied Materials, environmental services supplier Veolia and the U.S. Navy provided equipment for the prototype datacenter launched last summer.
Asked about potential customers, a Nautilus executive replied in an email that “we are in talks with some of the industry’s leading technology companies,” but he declined to name them. The startup expects to begin announcing customers in early 2016.
To convince skeptics, Nautilus stressed that it places its floating data centers on “ocean-worthy barges” that meet U.S. Coast Guard specifications and exceed maritime standards. The barges are former military or construction ships with an expected lifetime of up to 50 years. The platforms are then “moored to piers in protected ports,” the company said in a statement announcing construction of the first commercial floating datacenter.
Added a Nautilus executive: “They will be safely secured in protected ports, moored to land and supported by underwater stabilizers.” The data barges are built with 13 airtight compartments. The company said the platform would “maintain buoyancy” even if five compartments were flooded. Nautilus also is considering deploying pontoons as “added protection while providing additional space and stability to the barge.”
Why not just continue building data centers on terra firma and avoid the risk of being swamped? The startup, which has so far attracted $25 million in venture funding, claims its data barge design consumes up to 30 percent less energy than traditional datacenters while saving an estimated 130 million gallons of water annually.
Its patented technology uses the water on which the barge floats for cooling. Another selling point for data center operators in drought-stricken California: The data barge consumes no water since all used for cooling is recycled.
The floating facility occupies 30,000 square feet on a 230-foot barge, but the company claims that novel design efficiencies make the data barge equal to an 80,000-square-foot data center on land. The barge can be configured with up to 800 server racks and deployed in less than six months.
The company also asserts that floating data centers moored in “carefully selected military grade ports” offer greater security and the ability to withstand natural disasters like earthquakes.
The startup’s proprietary data center infrastructure management (DCIM) technologies are used to automate IT infrastructure. The DCIM approach also leverages artificial intelligence and provides what the company calls military-grade security.
Magcale touts the data barge concept as among the “most significant data center advances in decades.” For now, the startup has the private investment and brain trust to back up that claim, along with support from the Navy.
By: Mike Wheatley
We’ve seen a number of novel approaches in the data center industry aimed at reducing energy consumption, from building facilities above the Arctic Circle to using water misters for evaporative cooling. Now, a startup called Nautilus Data Technologies Inc. from Pleasanton, California, has put forward a radical new idea – the floating data center.
Last week, Nautilus announced it’s already building the world’s first commercial “data barge” at the Mare Island Naval Complex north of San Francisco. Enterprise Tech reports that the company actually came out of stealth earlier this year, with the promise of higher energy efficiency than standard data centers, competitive pricing, and greater mobility and security due to its waterborne nature. Once construction on its first floating data center is completed, the barge will be moved to a “secure port,” and the company will begin looking for customers to rent out some of its data center capacity.
But why a floating data center? The answer is simple – Nautilus touts a number of benefits, including significant power savings as well as water savings. According to the startup, its data barge will consume up to 30 percent less energy than traditional data centers while saving up to 130 million gallons of water per year.
The project has attracted some keen interest, with Silicon Valley-based firms like A10 Networks and Applied Materials, as well as the U.S. Navy, all expressing an interest in renting some of its data center capacity.
Despite the novelty of having a data center floating on water, Kirk Horton, the VP of sales and marketing at Nautilus, told Data Center Knowledge that customers were more interested in how the company achieves such high levels of energy efficiency and how it can deliver the kind of cost savings it promises.
“They see this massive 230-foot barge, and the whole notion of this being on water is out of their mind,” Horton said, adding that its data barges can be deployed in any port that meets its power, network and security requirements and moved to a new location when necessary.
What this means is that Nautilus’ data barges could be an ideal solution to meet the demand for greater data center capacity in “edge” locations to serve new, growing markets.
Nautilus is set to commission its first floating data center by the first quarter of next year, built atop a ship called the Eli M. Once it’s ready to set sail, the barge will be moored at an undisclosed location on the West Coast.
By: Yevgeniy Sverdlik
Data Center Knowledge
After they get over the initial “huh?” when they hear about the idea of building a data center on a floating barge, infrastructure execs for big companies usually want to know how Nautilus Data Technologies achieves the kind of energy efficiency it claims its floating data center offers, Kirk Horton, the startup’s VP of sales and marketing, said.
“That whole water factor completely evaporates once the client comes onto the construction site,” he said. “They see this massive 230-foot barge, and the whole notion of this being on water is out of their mind.”
Nautilus came out of stealth earlier this year, announcing its plan to build an 8 MW floating colocation data center, promising high energy efficiency, competitive pricing, and, due to its unusual approach to real estate, mobility and higher security than data centers on land. The company has completed a smaller proof of concept, has raised $25 million in private equity, and is now building its first commercial facility in a US Navy port on Mare Island, a peninsula in Vallejo, California, about 20 miles northeast of San Francisco.
Nautilus staff have taken many IT execs on tours of the prototype and the construction site on the barge, the company’s execs said. While they weren’t at liberty to name all the organizations who were interested in the project, those who participated in the proof of concept include Silicon Valley’s A10 Networks and Applied Materials, as well as the US Navy itself, according to Horton.
Like many other federal agencies, the Navy is in data center consolidation mode and actively looking for alternatives for its massive data center fleet. The department had more than 200 data centers around 2013, when it started its consolidation efforts.
Last year, it set a goal of reducing the number of its data centers to 20 or fewer and outsourcing 75 percent of its data center needs to commercial providers, according to a conference presentation earlier this year by John Pope, director of the Navy’s current Data Center and Application Optimization program.
Nautilus execs couldn’t disclose any specifics about the company’s engagement with the Navy, but “we have top-secret clearance with the Navy, and we’re doing some sensitive work for them,” Arnold Magcale, the company’s co-founder and CEO, said.
The reason the department has so many data centers is that it has traditionally built them to support warfighter operations wherever they are, according to Pope’s slides. “Warfighters require world-wide, secure, reliable, and timely information,” one of the slides read. “Multiple independent data centers grew up organically to support the warfighter.”
One of the reasons the Navy may be interested in floating data centers is their mobility. A Nautilus data center can be deployed in any port that meets the security, power, and network connectivity requirements and moved elsewhere when it is no longer required.
In the public sector, there’s demand for the ability to deploy data center capacity in places where it’s not easy to build brick-and-mortar facilities. One example would be so-called “edge” locations, where the amount of people connected to the internet is growing and content and service providers need data center capacity close to those locations to serve new growing markets, Horton said.
Magcale said he expects to commission the company’s first floating data center, being built atop a vessel named Eli M, after his mother, in the first quarter of 2016. Once completed, the barge will be moved to another location on the West Coast, which the execs also could not disclose.
Nautilus claims its facility will use up to 50 percent less energy than a traditional data center of comparable size. The energy savings will come from its patent-pending data center infrastructure management software and a cooling system that uses sea water, a feature that will be especially welcome in drought-ridden California.