Tuesday, October 8, 2024

Top 6 Database Challenges and Solutions


Database administrators and data architects can encounter a number of challenges when administering systems with different requirements and behavioral patterns. At the June 2023 Pure//Accelerate conference, Pure Storage’s Principal Solutions Manager Andrew Sillifant laid out six of the most common database challenges and his solutions for them.

1. Managing Scale within Cost Constraints

According to Statista, the volume of data and information created is increasing by about 19 percent each year, while others report storage growth figures far in excess of that amount.

“We are seeing data grow at 30 percent or more annually,” said Greg Johnson, Executive Director of Global Electronic Trading Services at JP Morgan Chase. “We hit the wall and were unable to keep up with traditional storage.”

To gain the speed and efficiency necessary to keep up with data expansion, the company switched to all-flash storage arrays that can scale out or scale up as demand requires.

“Performance-critical applications with latencies as low as 150 microseconds are best served by flash storage,” Sillifant said. “Always-on deduplication and compression features can enable more databases to run on fewer platforms.”

Sillifant said the Pure Storage FlashArray and FlashBlade platforms provide such benefits. Some models are tuned for top performance, others are engineered to pack more capacity into less space while still performing well, and the scale-out file and object platforms are aimed at demanding high-bandwidth, high-capacity use cases.
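To make the consolidation math concrete, here is a minimal Python sketch of how a data-reduction ratio affects the number of arrays needed; the 4:1 ratio and per-array capacity are illustrative assumptions, not Pure Storage figures.

```python
import math

def arrays_needed(total_db_tb: float, usable_tb_per_array: float,
                  reduction_ratio: float = 1.0) -> int:
    """Arrays required once dedupe/compression shrink the physical footprint."""
    physical_tb = total_db_tb / reduction_ratio
    return math.ceil(physical_tb / usable_tb_per_array)

# 400 TB of logical database data on arrays with 100 TB usable each (assumed figures)
print(arrays_needed(400, 100))       # 4 arrays with no data reduction
print(arrays_needed(400, 100, 4.0))  # 1 array at an assumed 4:1 reduction ratio
```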

2. Maintaining Consistent Performance

Oracle’s Cloud Business Group data reveals that database administrators (DBAs) spend an average of 90 percent of their time on maintenance tasks. The best way to reduce this maintenance burden is to improve reporting and back it with analytics and artificial intelligence (AI), making it easier to spot storage or other bottlenecks that inhibit database operations.
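As a rough illustration of the kind of automated reporting that can surface such bottlenecks, the sketch below flags latency samples that sit well above their recent trend. The window size and threshold are arbitrary assumptions, and a real deployment would feed in data from its monitoring stack rather than a plain list.

```python
from statistics import mean, stdev

def flag_latency_spikes(samples_ms, window=30, sigma=3.0):
    """Yield (index, value) for samples far above the trailing window's average."""
    for i in range(window, len(samples_ms)):
        history = samples_ms[i - window:i]
        mu, sd = mean(history), stdev(history)
        if sd > 0 and samples_ms[i] > mu + sigma * sd:
            yield i, samples_ms[i]
```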

3. Data Protection and Availability

Data protection, disaster recovery, and maintaining high availability for databases are persistent issues for DBAs. According to the Uptime Institute, 80 percent of data center managers and operators have experienced some type of outage in the past three years.

To boost data protection and disaster recovery, Sillifant recommended volume and filesystem snapshots that can serve as point-in-time images of database contents. For immutability, Pure Storage SafeMode snapshots give additional policy-driven protection to ensure that storage objects cannot be deleted. Another safeguard is continuous replication of volumes across longer distances and symmetric active/active bidirectional replication to achieve high availability.
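A minimal sketch of that snapshot workflow follows, assuming the database files live on a ZFS dataset driven from Python; a production workflow would also quiesce or flush the database first for consistency, and array-level snapshots such as Pure's SafeMode would be managed through the vendor's own tooling instead. The dataset name is a placeholder.

```python
import subprocess
from datetime import datetime, timezone

DATASET = "tank/db"  # assumed dataset holding the database files

def take_snapshot(tag: str = "pitr") -> str:
    """Create a timestamped point-in-time snapshot and return its name."""
    name = f"{DATASET}@{tag}-{datetime.now(timezone.utc):%Y%m%dT%H%M%SZ}"
    subprocess.run(["zfs", "snapshot", name], check=True)
    return name

def rollback(snapshot: str) -> None:
    """Restore the dataset to a prior image (most recent snapshot only)."""
    subprocess.run(["zfs", "rollback", snapshot], check=True)
```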

4. Management of Data Pipelines

As data sources multiply, so do the processes that support them, and the resulting complexity makes management a chore. DBAs and storage managers need as many metrics as possible to cut through that complexity and manage their data pipelines efficiently.

Some of these metrics are provided by tools from vendors such as Splunk and Oracle. Others are included within storage arrays. Pure, for example, has OpenMetrics exporters for its FlashArray and FlashBlade systems that allow IT staff to build common dashboards for multiple personas using off-the-shelf tools like Prometheus and Grafana.
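The sketch below shows the general pattern of pulling data from an OpenMetrics-style exporter endpoint in Python; the endpoint URL and metric prefix are placeholders, and in practice Prometheus scrapes the exporter and Grafana sits on top, as described above.

```python
import requests
from prometheus_client.parser import text_string_to_metric_families

EXPORTER_URL = "http://exporter.example.local:9090/metrics"  # placeholder endpoint

def fetch_metrics(prefix: str = "array_"):
    """Return {metric_name: [(labels, value), ...]} for metrics matching prefix."""
    text = requests.get(EXPORTER_URL, timeout=10).text
    out = {}
    for family in text_string_to_metric_families(text):
        if family.name.startswith(prefix):
            out[family.name] = [(s.labels, s.value) for s in family.samples]
    return out
```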

With containers growing so much in popularity, DBAs and storage managers also need tools to measure and manage their containerized assets and databases.

“If database queries are running slowly, for example, database personnel typically have no idea what is happening in the storage layer and vice versa,” said Sillifant. “There has traditionally been a lack of insight into each other’s worlds.”

He suggested Portworx Kubernetes storage to address the problems of monitoring data within containers, sharing that information across teams, and resolving issues. Metrics can be gathered from a number of layers, including the storage volume layer, and collated into a single reporting dashboard for containerized data.

“You can build common dashboards for databases and storage to correlate behavior and determine where problems lie,” said Sillifant. “Every time you solve such problems rapidly, you make the data more valuable to the business.”
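One way to approximate that cross-layer view outside a dashboard is to pull both series from Prometheus and compare them directly, as in the sketch below; the Prometheus URL and metric names are placeholders rather than actual exporter metrics.

```python
import time
from statistics import correlation  # Python 3.10+

import requests

PROM = "http://prometheus.example.local:9090"  # placeholder Prometheus server

def series(query: str, minutes: int = 60, step: str = "60s") -> list[float]:
    """Run a range query and return the first series' values as floats."""
    end = time.time()
    resp = requests.get(f"{PROM}/api/v1/query_range",
                        params={"query": query, "start": end - minutes * 60,
                                "end": end, "step": step}, timeout=10)
    result = resp.json()["data"]["result"]
    return [float(v) for _, v in result[0]["values"]] if result else []

db = series("avg(db_query_duration_seconds)")     # placeholder database metric
vol = series("avg(volume_read_latency_seconds)")  # placeholder storage metric
if len(db) == len(vol) and len(db) > 1:
    print(f"DB/storage latency correlation over the last hour: {correlation(db, vol):.2f}")
```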

5. Data Control

Organizations handling international data or operating in specific jurisdictions such as the European Union, California, or New Zealand must ensure that data is not put at risk by being shared across borders. Data residency, sovereignty, and localization, all of which fall under the heading of data control, have become more important than ever. Whether data sits in the cloud or on-premises, DBAs must pay attention to where it is stored and where it is going.

The solution in this case is granular, accurate tracking of where all data is stored and in what volumes. Those responsible for reporting and audits can then easily verify that data privacy policies are being observed and that data isn’t straying from where it is supposed to reside.
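A toy version of such a residency check might look like the following, where the inventory records and allowed-region policy are illustrative rather than drawn from any particular product.

```python
ALLOWED_REGIONS = {
    "eu_customer_db": {"eu-west-1", "eu-central-1"},  # assumed policy entries
    "us_ca_analytics": {"us-west-1"},
}

def residency_violations(inventory: list[dict]) -> list[dict]:
    """Return inventory entries stored outside their permitted regions."""
    return [item for item in inventory
            if item["region"] not in ALLOWED_REGIONS.get(item["dataset"], set())]

# The second replica has strayed outside the EU and gets flagged.
print(residency_violations([
    {"dataset": "eu_customer_db", "region": "eu-west-1", "size_tb": 12},
    {"dataset": "eu_customer_db", "region": "us-east-1", "size_tb": 12},
]))
```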

6. Data Migration

According to estimates, it can take anywhere from six to 24 months to set up and configure complex server architectures and cloud-native services when huge amounts of storage are involved. Migrating data from one database, server, or cloud to another often eats up much of this time, and the larger the data volume, the longer the migration delays.

Many of the features noted here help ease the data migration burden. Asynchronous replication and snapshots simplify the process of moving data from on-premises to the cloud and back. Snapshots eliminate the hours or even days needed to transfer the data from large databases and storage volumes to another location. Sillifant recommended Portworx for end-to-end data management of containers, which includes the ability to move their data from anywhere to anywhere.
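For filesystem-level snapshots, incremental replication can look like the sketch below, which ships only the blocks changed between two snapshots. The hosts, dataset, and snapshot names are placeholders, and array-level asynchronous replication or Portworx would play the equivalent role in the setups Sillifant describes.

```python
import subprocess

SOURCE_SNAP_OLD = "tank/db@nightly-2024-10-07"  # placeholder snapshot names
SOURCE_SNAP_NEW = "tank/db@nightly-2024-10-08"
DEST_HOST, DEST_DATASET = "backup@dr.example.local", "tank/db"

def replicate_incremental(prev_snap: str, new_snap: str) -> None:
    """Send only the delta between two snapshots to the remote dataset."""
    send = subprocess.Popen(["zfs", "send", "-i", prev_snap, new_snap],
                            stdout=subprocess.PIPE)
    subprocess.run(["ssh", DEST_HOST, "zfs", "receive", "-F", DEST_DATASET],
                   stdin=send.stdout, check=True)
    send.stdout.close()
    send.wait()

replicate_incremental(SOURCE_SNAP_OLD, SOURCE_SNAP_NEW)
```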

Modern Databases Need Modern Platforms

Modern databases are generally larger and more complex than ever. They must be able to exist in or interface with on-premises, multi-cloud, and hybrid environments. To do so efficiently and well, they must be supported by storage platforms and tools that offer the speed, agility and flexibility needed to keep up with the pace of modern business.
