By Edgar Walters
What do a warehouse in North Austin and a building at Angelo State University have in common? They hold trillions of bytes of some of Texans’ most sensitive information, including health and education records.
The Texas Legislature created the twin data centers in 2005 to consolidate disparate data management operations at dozens of state agencies. But since then, as government programs churned out more and more electronic information about health care, highways, public schools and other key services, the cost to operate the facilities has ballooned.
This session, lawmakers are considering an overhaul of how the state uses its data centers, with an eye toward tech companies like Amazon and Microsoft that own private networks of remote servers known as the “cloud.” Proponents say hiring such a firm to be the official keeper of much of the state’s data could save millions of dollars and modernize vulnerable government tech infrastructure. But detractors say the current setup is working fine and that any kind of structural change would be laborious, expensive and potentially risky.
A decade ago, it cost $278 million to run the centers over the state’s two-year budget cycle; under the current spending plan, it costs about $489 million to operate them.
“What can we do to try to reduce those costs?” state Rep. Giovanni Capriglione, R-Southlake, asked state information officers at a recent committee hearing. “Today there’s a lot of options in terms of what we can do with the data center.”
Though some lawmakers have bristled at the idea of private companies storing Texans’ personal information in far-flung locations, proponents of the reforms say data security will be at the forefront of any decision they make.
“We are not signing a contract with anybody until we have a chance to find out what’s really going on here,” said state Sen. Jane Nelson, a Flower Mound Republican who chairs the Senate Finance Committee. “The discussion about whether we do cloud and all that, we can have that discussion. I want to make sure — A, we’re protecting that information, [and] B, that we are keeping that information in Texas.”
Much of the data center debate this session has centered on a $1.5 billion deal that the Texas Department of Information Resources made with a French-headquartered company, Atos, to operate the facilities. In recent committee hearings, lawmakers have encouraged the agency to look at data storage options offered by cloud-computing service providers.
“I don’t understand why we’re so far behind here on this,” said state Rep. Donna Howard at a recent legislative hearing on data centers. The Austin Democrat noted that her city’s — and Texas’ — reputation as a tech hub doesn’t jibe with the state government still “doing Medicaid on Excel spreadsheets.”
In 2012, the state’s technology agency signed a deal that made Xerox responsible for maintaining the data center’s mainframe computers and servers, as well as overseeing some printing and mail services. Three years later, Xerox backed out of the contract, and the board of the Department of Information Resources approved a deal for Atos to take over.
That did not sit well with Nelson, who in November sent the agency a sternly worded letter.
“I am concerned that a contract of this magnitude — worth $1.5 billion — can be reassigned from one vendor to another” without oversight from the Legislature, she wrote. “In a time of heightened focus on data security, I also have concerns with trusting our most sensitive and important data to a foreign corporation whose main operations may not be fully subject to federal and state laws.”
Atos’ contract expires next year, and Nelson has asked the Department of Information Resources not to sign another data center management contract until the Legislature has a chance to weigh in.
Last week, Nelson filed a bill that would require state agencies to consider cloud-based storage options when creating new government software applications. Another bill, authored by Capriglione, would create a technology modernization fund that agencies could use to pay for a transition to cloud-computing services.
State agencies already have some authority to bypass the data center and hire outside companies for certain data management projects, but only if the agency gets permission from the Department of Information Resources.
In an interview, Capriglione said he had heard from state officials, whom he declined to name, who recounted their frustrations working with a state data center they said was expensive and cumbersome.
“Here’s the reality — anyone that’s looking at this has come to the conclusion that cloud-based technology is significantly more secure, more resilient, more future-proof, than any sort of in-house data center client service,” Capriglione said.
Doug Robinson, the executive director of the National Association of State Chief Information Officers, said many states beyond Texas are discussing changes to their data centers. Ten states store data on mainframe computers that are located off-premise, according to the group. Others pursue a “hybrid model” with a mix of on-premise and remote storage. A survey found that more than half of states plan to downsize their own data centers.
“The next question is, once you’ve continued to look at that model, does it make more sense to actually decide that as a state you’re not going to be in the data center business at all,” Robinson said.
The pros and cons of relying more heavily on cloud storage providers are similar to the trade-offs associated with the outsourcing of any service, said Yevgeniy Sverdlik, editor-in-chief of Data Center Knowledge, a website that tracks the industry. “Instead of making a sandwich on your own, you have to buy a packaged sandwich,” he said.
“What you get, though, with cloud providers, is more up-to-date infrastructure, and they’re able to, because of the amount of capital they spend on this infrastructure, they can update a lot more quickly than anyone else can in-house,” Sverdlik said. “If these [Texas data center] facilities were built in 2005, by industry standards they’re getting pretty up there in age.”
An audit last year of a government program built to store birth records sheds light on the conundrum Texas agencies face when it comes to data storage.
Public health officials in 2015 received roughly $15 million to redesign their birth and death records management system. Staff at the Department of State Health Services, wanting to stretch the funding as far as possible, hired Genesis Systems, a private firm that offered a cheaper data-hosting plan than the state data center.
But doing so without permission from the state’s information technology agency was against state policy, and when the Department of Information Resources discovered the contract, it required the project to be redesigned and hosted at its own data centers. State auditors found the change-up increased the project’s cost by $1.8 million and delayed its intended roll-out date by one year.
The state’s technology officers don’t dispute that data storage plans sold by tech companies on the private market are often cheaper than the state’s homegrown data center offerings. “Storage in the cloud, for example, is generally less expensive than storage in our [on-premise] data center environment,” Todd Kimbriel, the state’s chief information officer, said at a recent legislative hearing. The Department of Information Resources did not respond to follow-up questions for this article.
But there are political and logistical obstacles to abandoning the data-center model. For one, the state has sunk massive amounts of money to build and maintain its facilities in Austin and San Angelo. “It’s very expensive to walk away from that investment,” Kimbriel said.
Another barrier is that the hardware and software used in Texas government tend to run at least a decade behind the private sector’s, and many tech companies charge exorbitant rates to handle outdated software. For example, Kimbriel said the state keeps roughly 70 servers running on a Windows 2003 operating system that is no longer supported by Microsoft. Because those servers host “mission critical” information, Kimbriel said, they must be housed at the state’s data center but kept isolated from other servers — and Microsoft charges a “bounty” to provide basic support.
Five or 10 years ago, governments were more likely to consider on-premise data storage to be safer from cyberattacks, but as the federal government has strengthened regulatory requirements and the cloud storage market has matured, many officials consider private-sector options to be more secure, Robinson said. The Department of Information Resources recently told lawmakers it sees hackers attempt to compromise state data about 2 billion times per day.
When debating a move to the cloud, “the security aspect of it, I think, is not a player anymore,” said Paul Ferrill, a consultant who has written about data centers. In 2013, for example, the CIA awarded Amazon Web Services a $600 million cloud-computing contract. The U.S. Department of Defense is mulling a $10 billion cloud contract.
“I really don’t see security as a pro or a con,” Ferrill said. “The biggest pro and or con is going to be cost.”
Disclosure: Amazon Web Services and Microsoft have been financial supporters of The Texas Tribune, a nonprofit, nonpartisan news organization that is funded in part by donations from members, foundations and corporate sponsors. Financial supporters play no role in the Tribune’s journalism. Find a complete list of them here.