Powering the Public Sector’s Next-Generation Data Center with Flash Storage
Industry Perspective
Introduction

The next-generation data center will be a more agile, scalable and automated infrastructure, enabling enterprise users to respond quickly to changing requirements and expand capacity and performance on demand without downtime. To that end, data center managers need storage systems that will let them deploy new applications faster and with greater agility. Flash drives are more efficient and faster than spinning disk drives, allowing government agencies to store more data in less space and access it faster, in effect letting users access information when and how they need it, in a cost-effective way.
Federal data centers are undergoing a long-overdue makeover. Agencies’ efforts to consolidate and optimize data centers are vital components of the federal government’s move toward an information technology portfolio that is more efficient, effective and secure, and better able to deliver world-class services to the public in this evolving digital era.

The next-generation data center must be more agile, scalable and automated than traditional data centers, allowing the government workforce to respond quickly to changes and expand capacity and performance on demand without interruption. As government data continues to grow exponentially, data center managers and storage administrators must move beyond outdated storage technology and siloed point solutions to manage storage issues and workloads in a way that helps the workforce derive real meaning from data.

In the past, data center managers had to add many extra spinning disks to get the performance they needed, but would then wind up with more storage than necessary, driving up cooling and power requirements. Now, data center managers need storage systems that will let them deploy new applications faster and with greater agility. Flash drives are more efficient and faster than spinning disk drives, allowing government agencies to store more data in less space and access it faster. Today, flash storage is past the disruptor phase and is the standard storage technology enterprises and government agencies are putting into next-generation data centers.

In this industry perspective, GovLoop, NetApp, a leader in flash storage solutions, and its value-added reseller partner, ThunderCat Technology, discuss ways federal agencies can optimize their data centers.
What Do Agencies Expect from Data Centers Today?

The government workforce, just like workers in the commercial sector, wants fast access to information at any time, from any place and on any device. “They’re looking for their data centers to be much more adaptable,” said Kurt Steege, CTO of ThunderCat, which means the ability to move workloads to wherever they are needed.
The protection and security of data are also vital, especially as government agencies combat current and emerging cyber threats from both internal and external adversaries. As such, agency managers want assurance that their information is isolated and secured from other agencies’ data within multi-tenant environments.
The Next-Generation Data Center

Today, countless applications in the data center are driving the need for greater performance. Twenty years ago, many organizations ran fewer applications for business operations, and storage administrators could better manage priority applications, ensuring they had the right hardware and computing resources. But as the volume of data and applications grew, infrastructure and storage silos emerged, as each application required its own dedicated hardware to prevent applications residing on the same infrastructure from disrupting one another. The data center became organized into data and application silos, defined by separate interfaces and separate groups tasked with managing the provisioning and operation of those silos, said Jeramiah Dooley, Principal Architect in the SolidFire Office of the CTO at NetApp.

“The data center of yesterday needed a jumpstart in lots of places,” Dooley said. “It needed a jumpstart in how flexible it could be, and it needed a jumpstart in how automated and how API-driven it could be,” which means that each data center infrastructure component could be provisioned and managed through an application programming interface (API). Most importantly, the data center needed to be able to scale and expand securely in capacity and performance wherever needed, and to do so in increments, without interrupting any of the workloads already running there, Dooley said.

What’s more, public-sector agencies are deploying cloud and virtualization services to speed up performance and lower costs. With so much new cloud data moving through, and an increased number of users relying on cloud services, the modern data center must be more software-defined and automated, with the ability to guarantee performance, data assurance and global efficiencies. In a software-defined data center (SDDC) architecture, all infrastructure components are virtualized and delivered as a service. Control of the data center is fully automated by software, which means component configuration is maintained through intelligent software systems. Security, storage, networking and even the data center itself are now part of the software-defined realm. Moreover, automating tasks and orchestrating workflows are fundamental if agencies’ service-delivery needs are to be met at scale.
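API-driven provisioning means a storage request can be expressed as a small, scriptable payload rather than a manual ticket. The Python sketch below builds such a request; the field names and QoS parameters are hypothetical placeholders for illustration, not any specific vendor’s API schema.

```python
import json

def make_volume_request(name, size_gb, min_iops, max_iops):
    """Build the JSON body for a volume-provisioning API call.

    The field names here are hypothetical; a real storage API
    defines its own schema and QoS vocabulary.
    """
    if min_iops > max_iops:
        raise ValueError("min_iops cannot exceed max_iops")
    return {
        "name": name,
        "sizeBytes": size_gb * 1024**3,
        "qos": {"minIOPS": min_iops, "maxIOPS": max_iops},
    }

# An automation script would POST this payload to the storage
# controller's REST endpoint; here we just print it:
payload = make_volume_request("hr-db-vol01", size_gb=500,
                              min_iops=1000, max_iops=15000)
print(json.dumps(payload, indent=2))
```

Because the request is plain data, the same script can provision one volume or a thousand, which is what lets the data center scale in increments without manual administration.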
Major differences between the next-generation data center and traditional data centers include:

• Multi-tenant versus single-tenant;
• Mixed workloads versus isolated workloads;
• Shared infrastructure versus dedicated infrastructure;
• Scale-out design versus scale-up;
• Capacity on demand versus preprovisioned capacity;
• Software-defined versus hardware-defined;
• Self-service versus project-based; and
• Automated versus manual administration.
From Consolidation to Optimization

As Dominic Sale of the General Services Administration’s Office of Government-wide Policy wrote in an August 2016 blog post, “Few would debate that if we had the opportunity to build the federal government’s IT infrastructure from the ground up today, it would certainly look very different than the current state.”

In 2010, the Office of Management and Budget launched the Federal Data Center Consolidation Initiative (FDCCI) to promote the use of green IT by reducing the overall energy and real estate footprint of government data centers. Other FDCCI objectives focused on reducing the cost of data center hardware, software and operations; increasing the overall IT security posture of the federal government; and shifting IT investments to more efficient computing platforms and technologies.

The government should be able to save at least $8.1 billion through data center closures through 2019, according to a Government Accountability Office report released in March 2016. At the time, the report cited that the government had closed 3,125 of its 11,000 data centers, leading to $2.8 billion in savings under FDCCI. In March 2016, OMB shifted the focus of federal Chief Information Officers from the consolidation of data centers to their optimization under a new policy, the Data Center Optimization Initiative (DCOI), which supersedes FDCCI.
“The DCOI is an ambitious effort to optimize and innovate more than 10,000 data centers operated by federal agencies,” Sale wrote. In August 2016, GSA announced that the Office of Government-wide Policy would serve as the Managing Partner for DCOI to help federal agencies implement the data center provisions of the Federal Information Technology Acquisition Reform Act (FITARA).

“The DCOI is the next logical step in the history of federal data centers, following the Federal Data Center Consolidation Initiative to achieve cost savings through consolidation of redundant federal data centers. With the aggregate cost of operations, changing hosting options, and the ever-increasing need for better information security, optimizing our infrastructure is a sound investment and the right thing to do,” Sale wrote.

The DCOI requires agencies to:

• Develop and report on their data center strategies;
• Transition to more efficient infrastructure, such as cloud services and inter-agency shared services;
• Leverage technology advancements to optimize infrastructure; and
• Provide quality services for the public good.
Cloud Workloads Will Triple Over the Next 3 Years

Cisco predicts that worldwide, “By 2020, 92 percent of workloads will be processed by cloud data centers; 8 percent will be processed by traditional data centers.” Plus, “overall data center workloads will more than double (2.6-fold) from 2015 to 2020; however, cloud workloads will more than triple (3.2-fold) over the same period,” according to the Cisco Global Cloud Index, released in 2016.

http://www.cisco.com/c/dam/en/us/solutions/collateral/service-provider/global-cloud-index-gci/white-paper-c11-738085.pdf
Flash Storage: From Disruptor to Storage of Choice

Flash storage offers high-performance capability without consuming much rack-unit space, so agencies save on capital expenditures, including power usage and cooling requirements, all of which help agencies align with the federal government’s data center consolidation and optimization mandates. Spinning hard disk drives have been the predominant storage technology for the last 30 years, consuming a lot of power and generating a lot of heat. Flash storage runs cooler, without the same airflow requirements. Some reports indicate that flash storage consumes as little as 20 percent of the power of a traditional spinning hard drive and reads data as much as 100 times faster.

An all-flash array is a solid-state storage system that contains multiple flash memory drives instead of spinning hard disk drives. Flash memory, which has no moving parts, is a type of nonvolatile memory that can be erased and reprogrammed in units of memory called blocks. Flash storage is a variation of electrically erasable programmable read-only memory (EEPROM), and it got its name because its memory blocks can be erased in a single action, or “flash.” A flash array can transfer data to and from solid-state drives (SSDs) much faster than electromechanical disk drives can. Flash storage uses electricity to store data in addressable locations on a fixed, thin layer of oxide, so data is retained even when the power is off.
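The block-erase behavior described above has a practical consequence: a programmed cell cannot simply be overwritten, so rewriting any page means erasing the whole block first. The toy Python model below illustrates that constraint; it is a deliberately simplified sketch, not a faithful model of real NAND controllers, which hide this behavior behind wear leveling and garbage collection.

```python
class FlashBlock:
    """Toy model of a flash erase block (illustrative only).

    Pages can each be programmed once; erasing is block-granular,
    resetting every page in one action, or "flash".
    """
    def __init__(self, pages=4):
        self.pages = [None] * pages   # None = erased, i.e. writable

    def write(self, page, data):
        if self.pages[page] is not None:
            raise ValueError("page already programmed; erase block first")
        self.pages[page] = data

    def erase(self):
        # Erase resets the entire block, not an individual page.
        self.pages = [None] * len(self.pages)

block = FlashBlock()
block.write(0, "log-entry-1")
block.write(1, "log-entry-2")
# Rewriting page 0 in place is not possible; the whole block must
# be erased (losing page 1 as well) before page 0 can be rewritten:
block.erase()
block.write(0, "log-entry-1-updated")
```

This asymmetry between page writes and block erases is why flash controllers relocate data rather than overwrite it, and it is part of what the array software layer manages on the administrator’s behalf.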
Virtual desktops and databases are the two biggest applications targeted for flash arrays because they generate a lot of random reads. In fact, most of the workloads in data centers today have a highly random, volatile component. Archival storage, however, is probably not suited for flash. While archives could be put on flash and analytics could be performed on the data, “from a cost perspective it probably just doesn’t make sense still at this point when you talk about the archival side,” Steege said.

“What we’ve done is we’ve taken the ability to read data and write data, and we’ve just made it faster,” Dooley said. Flash is becoming cost-efficient enough that it can be used in many places where it wasn’t used before. Organizations are also getting value out of the software that runs on top of it, such as data deduplication, compression and data replication software. Agencies are looking at issues such as how to scale data and how to protect workloads from one another inside a single storage cluster. “Those are the things that I think are really enabling the federal space — the enterprise space in general — to look at storage as more than just something that fits into the data center and turning it into something that can be useful or relevant in how they deliver services up to their end users,” Dooley said.
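Data deduplication, one of the software features mentioned above, works by storing only one physical copy of each unique data block. The Python sketch below shows the core idea under simplifying assumptions: fixed-size blocks and content addressing by SHA-256 digest; production systems add variable-size chunking, compression and metadata management on top.

```python
import hashlib

def deduplicate(data, block_size=4096):
    """Split data into fixed-size blocks and keep one copy of each
    unique block, keyed by its SHA-256 digest. A simplified sketch
    of block-level deduplication in an array's software layer.
    """
    store = {}      # digest -> block bytes (physical copies)
    recipe = []     # ordered digests to reconstruct the original data
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # store each block only once
        recipe.append(digest)
    return store, recipe

# Ten identical 4 KiB blocks deduplicate down to one stored copy:
data = b"A" * 4096 * 10
store, recipe = deduplicate(data)
print(len(recipe), "logical blocks,", len(store), "physical block(s)")
```

Highly redundant workloads such as virtual desktops, where hundreds of nearly identical OS images share the same blocks, are exactly where this software multiplies the effective capacity of the underlying flash.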
3 Tips for a Successful Flash Implementation

1. The architecture upon which flash resides is important.
Storage users have many choices these days: Are they going with a traditional hardware-based array or with software-defined storage? As they grapple with how to acquire storage; how to update, replace or refresh it; and how to scale it out, the storage architecture vendors use to do that matters a lot, Dooley said.

2. Focus on how flash is being delivered.
Operational efficiency and the degree of programmatic, automated storage management underneath a platform can vary from implementation to implementation, even when two platforms use the exact same flash.

3. Design a flexible and adaptable network to take advantage of different workloads.
Cloud services, mobile devices, social networks and big data analytics are driving the growth of data in many organizations, much of it unstructured. To handle workloads for such applications, agencies will need integrated systems with storage that can scale out, and a network with the ability to move workloads to where they are needed.
NetApp and ThunderCat: How They Can Help

Agencies rely on NetApp expertise for flexible, efficient and secure data management solutions, delivered through its value-added reseller partner ThunderCat Technology. In partnership, NetApp and ThunderCat offer flash storage solutions suited to meet a multitude of workloads in any storage environment.
NetApp’s SolidFire: Modern data centers require storage that’s as innovative as the business models they must enable. SolidFire’s scale-out, 100 percent predictable and fully automated platform gives you total flexibility in all-flash storage. Now you can easily and non-disruptively deploy guaranteed application performance at unlimited scale, no matter how you consume storage.
Conclusion

Agencies are focusing on trends like IT-as-a-Service, data analytics, cloud computing and mobility while, at the same time, being tasked with providing the public superior service in this new digital age. Thus, data center managers must ensure that their facilities are optimized to provide high performance, always-on availability and assurance that agency data is always safe. Agency IT infrastructure must be highly scalable, flexible and rapidly accessible to multiple applications. Data center managers need storage systems that will let them deploy new applications faster and with greater agility. Flash storage is rapidly becoming the standard storage technology that enterprises and government agencies are deploying to meet the requirements of the next-generation data center.
About NetApp

Government agencies of all levels count on NetApp for software, systems and services to manage and store their most important asset: their data. With solutions ranging from data protection and recovery to cloud computing, data analytics and flash, NetApp has become government customers’ top choice for key technologies that drive data center transformation. Top counties, cities and states count on NetApp and value our teamwork, expertise and passion for helping them succeed now and into the future. For more information, visit www.netapp.com.

About ThunderCat

ThunderCat Technology is a service-disabled veteran-owned small business that delivers technology products and services to the federal government and Fortune 500 companies. ThunderCat is a value-added reseller that brings an innovative approach to solving customer problems in and around the data center by providing strategies for data storage, networking, cybersecurity and applications.
About GovLoop GovLoop’s mission is to “connect government to improve government.” We aim to inspire public-sector professionals by serving as the knowledge network for government. GovLoop connects more than 250,000 members, fostering cross-government collaboration, solving common problems and advancing government careers. GovLoop is headquartered in Washington, D.C., with a team of dedicated professionals who share a commitment to connect and improve government. For more information about this report, please reach out to info@govloop.com.
1152 15th St. NW, Suite 800 Washington, DC 20005 (202) 407-7421 | F: (202) 407-7501 www.govloop.com @govloop