In the relatively new world of software defined storage (SDS), following best practices can make application deployment more flexible and capable, yet this is not without its challenges. SDS can be defined as a set of software capabilities that automatically manage data locally and globally, providing breakthrough speed in data access, easier administration and the ability to scale technology infrastructure quickly and cost-effectively as data volumes expand. In addition, these advances can work with any company’s storage systems to provide automated and virtualized storage.
However, application deployment works differently from one SDS vendor to the next. This means attention must be paid to the many facets of software defined storage app deployment.
SDS can cover virtually all application, workload and use case types. Depending on the application and workload, users should select the most appropriate SDS platform. That means taking a close look at the file, block or object data in use.
“Know your application workloads’ performance, availability, capacity and economic (PACE) attributes and requirements, as well as the hardware device options that need to be aligned with your environment,” said Greg Schulz, analyst with StorageIO Group.
Within a content repository, a client might use object storage. However, for database workloads (whether traditional or newer-style databases), SDS platforms that support block data would be preferred. Options include bare metal servers, storage arrays, virtual machines and the cloud.
Ideally, providers of SDS infrastructure should provide a menu of services that cater to different types of applications. It’s important to understand which service is most appropriate given the profile of the application’s workload. Both the provider of infrastructure and the tenant should perform ongoing capacity planning to ensure both services and applications meet their service level agreements. Those services should be backed up with sufficient CPU, memory and I/O networking connectivity resources.
Monitoring Application Performance in SDS
Software application deployment benefits materially from performance monitoring. The best approach is to monitor as close as possible to the point where users of the application will notice problems. Combine this kind of monitoring with internal metrics to verify that users are happy with the application. Crash reporting should also be implemented to detect bad deployments early.
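As a minimal sketch of this idea (the function names, the 200 ms threshold and the 95th percentile are illustrative assumptions, not a specific product's API), user-facing latency samples can be checked against a service-level objective to flag a bad deployment:

```python
def latency_percentile(samples_ms, pct):
    """Return the pct-th percentile of latency samples (simple nearest-rank method)."""
    ordered = sorted(samples_ms)
    index = max(0, int(round(pct / 100.0 * len(ordered))) - 1)
    return ordered[index]

def breaches_slo(samples_ms, slo_ms=200.0, pct=95):
    """Flag the deployment as suspect if tail latency exceeds the SLO."""
    return latency_percentile(samples_ms, pct) > slo_ms

# Example: 18 fast requests and 2 slow outliers, measured at the user-facing edge
samples = [50.0] * 18 + [500.0] * 2
print(breaches_slo(samples))  # True: p95 latency is 500 ms, above the 200 ms SLO
```

A real deployment would feed this check from edge or client-side telemetry rather than server-side counters, since that is where users actually notice problems.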
Fundamentally, SDS enables storage administrators to scale and manage their infrastructure much more granularly than with traditional storage. But if the necessary knowledge is not in place before implementation, application deployment disasters are possible. The best way to avoid serious trouble is to plan, size, and profile the application by considering data growth in the near term.
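To illustrate the sizing step, here is a hypothetical compound-growth projection (the figures and growth rate are assumptions chosen for the example, not recommendations):

```python
def project_capacity(current_tb, monthly_growth_pct, months):
    """Project capacity needs under steady compound monthly data growth."""
    return current_tb * (1 + monthly_growth_pct / 100.0) ** months

def months_until_full(current_tb, ceiling_tb, monthly_growth_pct):
    """Count the months before usage hits the deployed capacity ceiling."""
    months = 0
    usage = current_tb
    while usage < ceiling_tb:
        usage *= 1 + monthly_growth_pct / 100.0
        months += 1
    return months

# 100 TB today, growing 5% per month, against a 200 TB deployment
print(round(project_capacity(100, 5, 12), 1))  # 179.6 TB after one year
print(months_until_full(100, 200, 5))          # 15 months to hit the ceiling
```

Even this crude model makes the point: at 5% monthly growth, a deployment sized at 2x current usage buys just over a year before a scale-out decision is forced.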
Upfront architectural design consulting: SDS users should also take advantage of upfront architectural design consulting, as well as training on how to maintain the system over its lifecycle.
Simulate failure: Similarly, it’s wise to simulate failure conditions in a controlled way to understand how applications handle them, and select application architectures that emphasize detection and remediation instead of prevention (MTTR over MTBF).
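The simulate-and-remediate approach can be sketched as a simple fault injector paired with a retrying caller (the names, failure rate and retry count here are illustrative assumptions, not a particular chaos-testing tool):

```python
import random

def flaky(func, failure_rate=0.3, rng=None):
    """Wrap func so it raises IOError at the given rate -- a controlled fault injector."""
    rng = rng or random.Random()
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise IOError("injected storage fault")
        return func(*args, **kwargs)
    return wrapper

def read_with_retry(read, attempts=5):
    """Remediation-focused caller: detect the fault and retry, rather than assume it never happens."""
    for _ in range(attempts):
        try:
            return read()
        except IOError:
            continue  # detected a fault; remediate by retrying
    raise RuntimeError("storage unavailable after retries")

# Exercise the application path against a storage layer that fails half the time
reader = flaky(lambda: b"block-0", failure_rate=0.5, rng=random.Random(42))
print(read_with_retry(reader))  # succeeds, possibly after retries
```

Running application code against a wrapper like this in a test environment shows whether the architecture really favors fast detection and recovery (MTTR) over hoping failures never occur (MTBF).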
In addition, many SDS applications offer a rich set of data integrity and geo-replication features to help guard against natural disasters and human error. In that sense, SDS is no different from any other type of application or storage. A plan must be in place to recover from any minor or major events that could shut down the systems or result in data loss.
Be aware of application reaction: On a smaller scale, storage managers should be aware of how the application will react to adverse conditions. For example, what happens when the application in your SDS environment does not get any response for 0.1 seconds or 1 second? Those designing the SDS system should map their application requirements with a strong focus on latency and bandwidth limitations.
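One way to answer the "no response for 0.1 seconds" question is to put an explicit deadline on storage reads and degrade gracefully. This sketch assumes a Python application and hypothetical function names; real code would likely use the storage client's own timeout settings instead:

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout
import time

def read_with_deadline(read, deadline_s, fallback=None):
    """Bound a storage read at deadline_s seconds; return fallback instead of hanging."""
    pool = ThreadPoolExecutor(max_workers=1)
    try:
        future = pool.submit(read)
        try:
            return future.result(timeout=deadline_s)
        except FutureTimeout:
            return fallback  # degrade gracefully rather than block the application
    finally:
        pool.shutdown(wait=False)  # don't wait on the stalled read

def slow_read():
    time.sleep(1.0)  # simulates a storage layer that stalls for a full second
    return b"data"

# A read that stalls for 1 second breaches a 0.1 s deadline
print(read_with_deadline(slow_read, deadline_s=0.1, fallback=b"cached"))  # b'cached'
```

Deciding what the fallback should be (stale cache, error page, queued retry) is exactly the latency-mapping exercise the text describes.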
The virtual data plane in software defined storage offers an array of advantages in monitoring and control.
Software Deployment Best Practices
Disasters in application deployment are more than theoretical. One financial services company is on record as going bust in only 45 minutes after a failed deployment caused it to lose nearly $400 million. That, of course, is an extreme case. To avoid even minor fallout from a software defined storage deployment, there are many best practices and tips that others have learned over the years.
Open source developers and software providers often post best practice documents, software application deployment tips and implementation guides on their websites. These should be studied in advance of any implementation.
Here are some of the top tips passed on by the experts:
Understand the big picture: Users must understand their workloads, applications, and use cases so they can deploy the right SDS.
Be aware of your underlying infrastructure: Be prepared to address the infrastructure that lies underneath the SDS or find a reseller or vendor to do that. Software defined application deployment may hit snags if it is implemented on top of older platforms that are in need of an update.
Support after sale: Pay careful attention to support after the sale. Who is responsible if there is a problem? Failure to take this into account could mean having to call several vendors during a problem, with each engaging in finger pointing, versus just calling one vendor who is willing to take charge.
The limits of marketing: Don’t blindly go along with software defined storage marketing. Instead, focus on what you need to define a storage system, solution or service that fits your environment, your applications and your workload and brings about the outcomes and business benefits you demand. It’s always far simpler to find software that fits your environment than to try to fit your environment to a software package that an executive ordered implemented because it is hot or trending.
Hardware matters: Keep in mind that while hardware requires software, on the other side of the coin, all software requires hardware somewhere. An SDS solution may no longer require a specific type of proprietary hardware, but that doesn’t necessarily mean it will run well on any hardware.
So many variations: Those selecting software defined applications should realize that SDS software is not all the same. There is a wealth of different packages and applications for various storage usage scenarios. The key is to map your own environment and needs, and then pick a solution that aligns and supports that specific application workload.
Know your environment: A fine point of differentiation is who is defining the storage. If you buy into a vendor’s vision, you may be letting them define your software and application requirements. But they don’t know your environment. So you should be the one defining what the software should be doing. Once you have achieved that understanding, go about the business of finding the vendor’s SDS approach that most closely matches what you are planning to accomplish.
Match your existing vendor: For those deploying in a virtualized environment, it may be best to converge your software defined application deployment around a particular virtual platform. A VMware shop should opt for vSAN and other VMware related tools to enable a software defined vision. Similarly, those primarily using Microsoft Windows and Hyper-V may be better advised to base their SDS deployment around Hyper-V tools such as Storage Spaces Direct (S2D).
Work with APIs: To obtain the biggest benefit from SDS technologies, applications need to work closely with the related APIs so they can take advantage of all the features on offer, such as object storage. Rather than integrating directly with each API, however, applications may be better served by a framework that provides the necessary abstractions, letting them benefit from API features without having to be modified directly. This approach offers greater versatility, since an array of services can be used without tying each application to any one service in particular.
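The abstraction idea can be sketched as a thin interface between the application and whatever object-storage API sits underneath. Everything here (class names, the in-memory stand-in, the key scheme) is a hypothetical illustration, not any vendor's SDK:

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Minimal abstraction so applications are not tied to any one storage service."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend for testing; a real subclass would wrap a vendor's object API."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # The application codes against the abstraction, not a specific SDS API.
    store.put(f"reports/{name}", body)

store = InMemoryStore()
archive_report(store, "q1.csv", b"revenue,42\n")
print(store.get("reports/q1.csv"))  # b'revenue,42\n'
```

Swapping backends then means writing one new `ObjectStore` subclass, not touching every application that stores data.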
Start with the familiar: Initially, stick to areas of familiarity when it comes to software defined storage deployment. For example, those running databases and that have experience of running a traditional SAN should focus on block storage-based SDS application deployment projects. Why? They have already attained some level of expertise with block storage so they don’t need to change their processes when engaged in software defined storage deployment. There are also scale-out NAS approaches to SDS that can run on hardware from a great many suppliers. Some deployments are said to be scalable to tens of PBs.
Decouple from the past: Take advantage of management flexibility. Because software-defined storage decouples the storage controller software that manages traditional storage arrays from the underlying physical storage, a greater level of flexibility is possible. Those managing the environment, therefore, should decouple their thinking from old approaches to management. The software-based model greatly increases deployment and management flexibility, enabling them to choose whatever hardware platform and management tools they prefer.