Why Next-Generation Database Applications Are Critical to Meeting Ever-Changing Market Conditions

Database technology has been around for decades, but as the use of technology in business and society has grown, it has evolved dramatically to meet changing needs.

Stuart Fisher, Regional Vice President Asia Pacific and Japan, Couchbase, stated in an interview with DSA that next-generation database applications and data flows are critical for high performance, agility, and the flexibility to respond to changing market conditions. Here’s why.

He said that the market and the dynamics of businesses, governments, and society, in general, are clearly moving in the direction of digital transformation. The world is changing at a rapid pace, and businesses must keep up with the massive amount of data being collected and transacted on a daily, hourly, and minute-by-minute basis.

"As we move closer to the Internet of Things (IoT), the collection of data through e-commerce, health apps, messaging apps, etc., we're looking to real-time information feeds. Therefore, there is a real case to be made that infrastructure needs to be updated, and a modern database will be critical to this. It must facilitate the flexible and agile development of new applications," he explained, adding that this is especially the case when we look at the traditional historical context of databases and where they came from.

Stuart mentioned that over the years, we have dealt with very structured relational databases with rigid tables, fixed inputs, and strict rules governing how they could accept and transact with data. This, however, has not kept pace with what we're dealing with on a daily basis: multiple data inputs from a wide variety of IoT devices.
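To make that contrast concrete, here is a minimal sketch in Python using only the standard library; the table layout, device names, and payload fields are illustrative assumptions rather than anything from a specific product. It shows how a fixed relational schema must be altered before it can accept a new field, whereas a document model stores each reading as a self-describing JSON document.

```python
import json
import sqlite3

# A rigid relational schema (hypothetical): every reading must fit these columns.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE readings (device_id TEXT, recorded_at TEXT, temperature REAL)"
)

# A newer device also reports humidity; the fixed table has no column for it,
# so the schema must be altered (ALTER TABLE) before the row can be stored.
new_reading = {
    "device_id": "sensor-42",
    "recorded_at": "2022-03-01T10:00:00Z",
    "temperature": 21.5,
    "humidity": 0.61,
}

# A document model sidesteps that step: each reading is stored as a
# self-describing JSON document, so devices with different fields can coexist.
documents = {}
documents["reading::sensor-42"] = json.dumps(new_reading)
documents["reading::sensor-7"] = json.dumps(
    {"device_id": "sensor-7", "recorded_at": "2022-03-01T09:00:00Z", "temperature": 19.8}
)

print(json.loads(documents["reading::sensor-42"]))
```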

"We've seen a massive shift from centralised to decentralised computing, with mobile devices causing or creating the vast majority of data that we use today," he said. Accepting data inputs from those locations is critical, hence it is driving the massive migration of data to the cloud, and according to Stuart, “There is no better infrastructure in terms of cost, resilience, and ROI than a hyperscale in the public cloud.”

When asked why organisations should have multiple options for migrating data from an on-premises or private cloud solution to a public cloud provider, he stated that migration concerns are at the top of every developer's priority list.

“There are numerous areas that revolve around availability, failover, security, and so on. When it comes to migration options, we're moving quickly away from the past, where vendors would lock in a customer or a customer would be locked into one vendor,” Stuart explained.

In addition, he stated that organisations must transition from a closed, structured environment to an open-source environment with fewer boundaries and limitations, one that can ingest data from any source and provide a powerful platform for computational analysis.

Although next-generation database applications provide numerous benefits, it’s not always smooth sailing for organisations implementing them. According to Stuart, many of the challenges stem from businesses' reliance on legacy relational databases and from the demands of rapid application development. Legacy relational databases will not suffice because they lack the agility and effectiveness needed for the kind of new digital outreach businesses now require. And rapid application development needs the entire infrastructure behind it to support new outreach, new ways of engaging with the audience, and rapid growth and scale.

This is where the benefits of hyperscale and cloud truly shine. "Enterprises can select a hyperscaler. They can do it ‘as-a-Service,’ without committing to large upfront investments, skills, infrastructure requirements, or licence obligations. They can test in an open-source environment and then marry that to the outreach application that they're attempting to create," Stuart explained.

This enables businesses to take a hybrid approach to their database. They can keep using their core database and Relational Database Management System (RDBMS), and as new microservices are developed, those services can be moved to the cloud.
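As a rough illustration of that hybrid split, the sketch below keeps the core records in a relational store (sqlite3 standing in for the existing on-premises RDBMS) while a new microservice writes fast-changing engagement data as JSON documents to a cloud-style document store (represented here by an in-memory dictionary). The service name, keys, and data shapes are assumptions for illustration only, not Couchbase APIs.

```python
import json
import sqlite3

# Stand-in for the existing on-premises RDBMS that keeps the core records.
core_db = sqlite3.connect(":memory:")
core_db.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
core_db.execute("INSERT INTO customers (id, name) VALUES (1, 'Acme Pte Ltd')")

# Stand-in for a cloud document store used by a new engagement microservice.
cloud_documents = {}

def record_engagement(customer_id, event):
    """New microservice path: write fast-changing engagement data to the cloud store."""
    key = "engagement::{}::{}".format(customer_id, event["ts"])
    cloud_documents[key] = json.dumps(dict(event, customer_id=customer_id))

def customer_profile(customer_id):
    """Combine the stable core record with the newer cloud-side documents."""
    name = core_db.execute(
        "SELECT name FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()[0]
    events = [
        json.loads(doc)
        for key, doc in cloud_documents.items()
        if key.startswith("engagement::{}::".format(customer_id))
    ]
    return {"id": customer_id, "name": name, "events": events}

record_engagement(1, {"ts": "2022-03-01T10:00:00Z", "channel": "mobile app", "action": "login"})
print(customer_profile(1))
```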

He concluded that the next-generation modern database should be simple to manage, particularly if it is automated: built-in automation makes modern databases easier to manage, configure, and set up. As these microservices establish themselves in different areas of the business or in different geographies, it becomes both easier and more powerful to ensure enterprises can control and manage their data from anywhere.
