Today’s user expectations: It’s all about data access
The past few years have seen a marked shift in users’ expectations of the applications they interact with. And when I say users, I’m speaking pretty broadly. A user can be anyone from an end user of a shopping site to a customer support engineer working a ticketing system to a business analyst looking for insights to drive their company forward. Whoever these users are, one thing is for sure: they want greater access to data than ever before, and that has big implications for the future of database technology.
The topic boils down to three themes. Let’s dive in!
1. Fast access to data
In the not-too-distant past, data reporting took a long time. In the best-case scenario you might wait 30 minutes or so, and quite often the process would take hours. But today, due in part to the Internet and mobile applications that we’ve all become accustomed to in our personal lives, users won’t accept slowness – and neither will businesses that need answers quickly. That demand for speed is a major factor driving the need for new data platforms that can serve large, ever-growing data sets.
2. Flexible exploration of data
In addition to speed, people want options. Say you’re a support analyst working in a ticketing system. You’d like to find a closed ticket that has some characteristics in common with a ticket that’s open now; after all, there may be an existing solution you can apply. You don’t want to be bound by a fixed view into that data, or a fixed dashboard. The same idea holds true for consumer shopping experiences; users expect to be able to conduct instantaneous, multifaceted searches: say, by brand, price, specific technical features, and availability at nearby brick-and-mortar locations. Anytime you have a large, active, dynamic data set with a high volume of inserts, these kinds of flexible, ad hoc searches push standard database solutions to the breaking point.
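As a rough illustration, that kind of ad hoc ticket search might combine structured filters with a free-text match in a single query. The table, columns, and values below are hypothetical, and the full-text match assumes a FULLTEXT index exists on the subject and description columns.

```sql
-- Hypothetical ad hoc search: closed tickets that resemble an open one.
-- Table, column, and literal values are illustrative, not a real schema;
-- MATCH ... AGAINST assumes a FULLTEXT index on (subject, description).
SELECT ticket_id, subject, resolution, closed_at
FROM tickets
WHERE status = 'closed'
  AND product = 'gateway-appliance'
  AND error_code = 'E4012'
  AND MATCH(subject, description) AGAINST('timeout during failover')
ORDER BY closed_at DESC
LIMIT 20;
```

The point isn’t this particular query; it’s that the next question the analyst asks may filter on entirely different columns, so the database has to handle arbitrary predicates over live, constantly changing data.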
3. Access to both current and historical data
The final piece of the puzzle is real-time access to both historical and current data. Say a business analyst needs to see the influence new data has on a long-running trend. They need to be able to graph information from several years ago against what’s happening today. And if an anomaly appears, they need to be able to drill down into it and look at the full detail of individual records. The old approach of building materialized views that summarize information on a daily basis, for instance, is no longer sufficient on its own. You have to be able to complement those summaries with the ability to delve into specific transactions, no matter how old they may be.
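To make that concrete, here is a sketch of two queries an analyst might run back to back against the same data set – a multi-year trend followed by a drill-down into the raw records behind an anomaly. The orders table, its columns, and the dates are all hypothetical.

```sql
-- Hypothetical multi-year trend: monthly revenue from several years ago through today.
SELECT DATE_FORMAT(order_date, '%Y-%m') AS month,
       SUM(total_amount)                AS revenue,
       COUNT(*)                         AS orders
FROM orders
WHERE order_date >= '2018-01-01'
GROUP BY month
ORDER BY month;

-- Drill-down into the individual records behind an anomalous month.
SELECT order_id, customer_id, order_date, total_amount, status
FROM orders
WHERE order_date >= '2021-03-01' AND order_date < '2021-04-01'
ORDER BY total_amount DESC;
```

Both queries need to see the same, current data: the trend should include today’s transactions, and the drill-down should return full-fidelity rows, not pre-aggregated summaries.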
What does it all mean?
When you add those three expectations together – speedy access, ad hoc query exploration, and simultaneous access to current and historical data – you’re exceeding the capabilities of traditional data access approaches. In the old days, with smaller data sets and lower expectations, you could get away with stuffing your regular OLTP database with data and allowing ad hoc queries to run on it. Or you could use a separate data warehouse or data mart, with batch jobs copying data into a historical store, which bifurcates the way you query your current data and your old data (and rules out real-time insights).
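That bifurcated setup typically looks something like the sketch below: a nightly batch job summarizes yesterday’s transactions into a separate reporting table, so analytical queries never see the current day’s activity until the next run. The schema and table names are hypothetical, and both schemas are shown on one server purely for brevity – in practice the copy usually crosses systems via an ETL pipeline.

```sql
-- Sketch of the traditional approach: a nightly batch job copies a
-- daily summary of yesterday's transactions into a reporting table.
-- Analysts query daily_sales_summary, so today's activity is invisible
-- until tomorrow's run. All names are hypothetical.
INSERT INTO warehouse.daily_sales_summary (sales_date, product_id, units, revenue)
SELECT DATE(order_date), product_id, SUM(quantity), SUM(total_amount)
FROM oltp.orders
WHERE order_date >= CURRENT_DATE - INTERVAL 1 DAY
  AND order_date <  CURRENT_DATE
GROUP BY DATE(order_date), product_id;
```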
So, how do you attack these requirements without investing in a big system that has separate ETL and data-transformation processes? The way forward is hybrid transactional/analytical processing (HTAP) – where a relational database can scale out and support both transactional and near-real-time analytical workloads. (For instance, MariaDB Platform is based on workload-optimized storage engines – row-based for transactional, columnar for analytical – and can scale without sacrificing schemas, transactions or SQL.)
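As a minimal sketch of what that hybrid layout can look like in MariaDB terms, the same logical data might be kept in a row-based InnoDB table for transactional work and a columnar ColumnStore table for analytics. The table definitions are illustrative, and the mechanism that keeps the two copies in sync (replication, change data capture, or platform-level configuration) is deliberately left out.

```sql
-- Hypothetical HTAP-style layout: one logical table backed by a row store
-- for OLTP and a column store for analytics. How rows flow from one to the
-- other depends on platform configuration and is not shown here.

CREATE TABLE orders_txn (
    order_id     BIGINT PRIMARY KEY,
    customer_id  BIGINT NOT NULL,
    order_date   DATETIME NOT NULL,
    total_amount DECIMAL(10,2) NOT NULL
) ENGINE=InnoDB;       -- row-based: fast single-row inserts, updates, lookups

CREATE TABLE orders_analytics (
    order_id     BIGINT,
    customer_id  BIGINT,
    order_date   DATETIME,
    total_amount DECIMAL(10,2)
) ENGINE=ColumnStore;  -- columnar: fast scans and aggregations over large data sets

-- Analytical query served by the columnar table, leaving OLTP traffic untouched.
SELECT customer_id, SUM(total_amount) AS lifetime_value
FROM orders_analytics
GROUP BY customer_id
ORDER BY lifetime_value DESC
LIMIT 100;
```

The appeal of this arrangement is that both tables speak the same SQL and live under the same schema and transaction model, rather than forcing a separate warehouse with its own tooling.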
Now that you understand the big three data access challenges, learn about the database requirements for addressing those challenges in this related post: The Place Between Transactions and Analytics.