What’s in Store for Storage in 2017

  • Autonomous Driving and Artificial Intelligence: Storage to Steer Straight and Smart

By 2020, ten million cars on the road are expected to have self-driving features, reflecting the continuing move toward an IoT-based world and advances in artificial intelligence (AI). Designing autonomous vehicles requires the ability to harness massive amounts of camera and sensor data, analyze it and apply AI technology, and – as with any AI application – the more data it has, the smarter it can become. In the past, one of the biggest hurdles for AI was processing power, but storage has increasingly become the limiting factor. Traditional, general-purpose storage solutions can’t keep up with the performance requirements or provide the long-term data retention and access capabilities needed at an affordable price point. As a result, more enterprises will turn to specialized storage and data management solutions that can meet these challenges.

  • Corporate Video: A Moving Picture is Worth 100,000 Words or More

Video will take on a much larger role in the activities and processes of more and more organizations, such as enhancing teaching and guarding against malpractice claims in hospitals, improving quality control in manufacturing facilities and analyzing buyer behavior in retail stores. Companies will also continue to make video a bigger part of their training and service initiatives, echoing the trend of YouTube videos replacing written manuals in the consumer space. In fact, while the written word is far from obsolete, in many areas video will increasingly become the de facto communication platform. All of this will require greater collaboration between IT and line-of-business owners to ensure they have the storage and data management infrastructure that can support a video-dominant world as efficiently and cost-effectively as possible.

  • Still the Bright Shiny Object (Storage)?

Object storage, with its extreme scalability and durability capabilities, provides access to large-scale data volumes at a lower price point than primary disk storage while also avoiding the increasing RAID rebuild times associated with high-capacity disks. Object storage has played a foundational role in public cloud services for some time now, but expectations that it would become the dominant technology for large-scale data retention have not been borne out. In many cases, users are realizing that an intelligent file system and the latest tape storage technology can provide equivalent, or even better, performance at a lower cost. Moving forward, therefore, enterprises will primarily deploy object storage as the basis for their private clouds, with tape maintaining its role as the optimal technology for long-term, low-cost archive of large-scale unstructured data.

  • Tale of the Tape: A Story that’s Far from Ending

There’s little question that tape’s role in backup has continued to decline, but tape as a storage medium is far from dead. The ever-increasing volume and value of unstructured data have brought greater attention to the importance of preserving and protecting this data in a robust, low-cost archive, and, as noted above, tape is still the best technology for long-term retention, with significant advances in performance, capacity and ecosystem integration being introduced every few years. That’s why organizations dealing with massive amounts of unstructured data – in genomics, academic research, video surveillance and entertainment, to name just a few examples – will continue to make tape a key part of their storage infrastructure, and why even enterprises that have vowed to move away from tape will find themselves reversing course. At the same time, as large public cloud providers expand their role as major storage players and compete to drive down pricing, they will increase their reliance on tape to make their business models work – indeed, although hidden behind the clouds, tape has long been used in this realm because the economics simply can’t be beat.

  • Avoiding Cloudy Islands

Industry analyst firm IDC forecasts that IT infrastructure spending for public and private cloud environments will increase at compound annual growth rates of approximately 19% and 10%, respectively, from 2015 to 2020. As more data moves to the public cloud, expect to see greater adoption of a dual-cloud vendor policy – just as enterprises maintain at least two sources for critical infrastructure components to avoid vendor lock-in and the loss of flexibility this entails, companies are recognizing the need to extend this approach to the cloud services they buy. A key challenge, however, will be linking their public clouds – and private clouds – together so they can seamlessly provision cloud resources and move workloads among the clouds. Nobody wants to return to a world of storage silos and the management problems this would entail. As a result, organizations will increasingly look for storage and data management solutions that are not only multi-site but also multi-cloud. In addition, they will begin to look to the cloud as the location where their data management tools can be hosted in a multi-site environment.
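To put those growth rates in concrete terms, a quick back-of-the-envelope calculation (using only the 19% and 10% CAGR figures cited above, compounded over the five years from 2015 to 2020) shows how much total spending growth they imply:

```python
def compound_growth(cagr: float, years: int) -> float:
    """Total growth multiplier after `years` of compounding at a given CAGR."""
    return (1 + cagr) ** years

# IDC's cited CAGRs for cloud IT infrastructure spending, 2015-2020
public_cloud_multiplier = compound_growth(0.19, 5)   # ~2.39x over five years
private_cloud_multiplier = compound_growth(0.10, 5)  # ~1.61x over five years
```

In other words, public cloud infrastructure spending would roughly 2.4x over the period and private cloud roughly 1.6x, which is why multi-cloud provisioning and data mobility become pressing concerns rather than theoretical ones.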

  • Video Resolution for the New Year

4K video is not new. Cinemas began projecting in 4K in 2011, and Netflix began streaming some shows in 4K two years ago. However, while much of the media and entertainment industry has moved to 4K and even begun looking beyond to higher resolution video, adoption of 4K in other markets has been relatively slow. One of the main reasons is that enterprises are still struggling to figure out how they can manage this type of data. 4K video presents challenges in terms of not only much larger file sizes but also much higher data rates – the ability to ingest and deliver 4K data in a smooth, predictable stream without dropping frames or creating other distortions is often beyond the capabilities of existing storage infrastructures. With video taking on a much larger role in the enterprise (as noted in an earlier prediction), organizations will increasingly need to address this gap. The key to doing so without replacing the entire storage infrastructure will be leveraging high-performance solutions optimized for video that can integrate into the existing infrastructure.
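A rough calculation makes the data-rate challenge concrete. The parameters below are illustrative assumptions, not figures from the article: DCI 4K resolution (4096×2160), 10-bit 4:2:2 sampling (an average of 20 bits per pixel), and 24 frames per second, for an uncompressed stream:

```python
def video_data_rate_mb_s(width: int, height: int,
                         bits_per_pixel: float, fps: float) -> float:
    """Uncompressed video data rate in megabytes per second (MB = 10**6 bytes)."""
    bits_per_second = width * height * bits_per_pixel * fps
    return bits_per_second / 8 / 1e6

# Illustrative: DCI 4K, 10-bit 4:2:2 (~20 bits/pixel), 24 fps
rate = video_data_rate_mb_s(4096, 2160, 20, 24)  # ~531 MB/s per stream
```

Even with compression reducing this substantially, sustaining several hundred megabytes per second per stream, for multiple concurrent streams, is what general-purpose storage infrastructures struggle to guarantee without dropped frames.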

  • HPC: Adapting to New Demands

With clustered computing becoming common in nearly every enterprise, organizations are rapidly creating more data that they want to use strategically to make better and faster business decisions – for example, regarding new investments, more efficient operations, higher product quality or improved customer service. As a result, enterprises are increasingly looking to the high-performance computing industry for best practices and technologies to provide the performance, capacity, and data management capabilities needed at this new scale. HPC providers can help enterprises integrate open source initiatives, determine how to leverage low-cost hardware platforms optimally and simplify the management of data in a single namespace with simple tiering. However, it’s essential that the HPC industry move beyond its traditional focus on binary data and account for the tremendous growth of richer unstructured data.

About Quantum

Quantum is a leading expert in scale-out storage, archive and data protection, providing solutions for capturing, sharing and preserving digital assets over the entire data lifecycle. From small businesses to major enterprises, more than 100,000 customers have trusted Quantum to address their most demanding data workflow challenges. Quantum’s end-to-end, tiered storage foundation enables customers to maximize the value of their data by making it accessible whenever and wherever needed, retaining it indefinitely and reducing total cost and complexity. See how at www.quantum.com/customerstories
