DAPD journal paper published

A journal paper from the UDBMS group at the Department of Computer Science has been published in Distributed and Parallel Databases (DAPD). Details follow:

Chao Zhang and Jiaheng Lu. "Holistic evaluation in multi-model databases benchmarking." Distributed and Parallel Databases (2019): 1-33.

Abstract: A multi-model database (MMDB) is designed to support multiple data models against a single, integrated back-end. Examples of data models include document, graph, relational, and key-value. As more and more platforms are developed to handle multi-model data, it has become crucial to establish a benchmark for evaluating the performance and usability of MMDBs. In this paper, we propose UniBench, a generic multi-model benchmark for a holistic evaluation of state-of-the-art MMDBs. UniBench consists of a set of mixed data models that mimics a social commerce application, covering JSON, XML, key-value, tabular, and graph data. We propose a three-phase framework to simulate real-life distributions and develop a multi-model data generator to produce the benchmarking data. Furthermore, to generate a comprehensive and unbiased query set, we develop an efficient algorithm for a new problem called multi-model parameter curation, which judiciously controls query selectivity across the diverse models. Finally, extensive experiments based on the proposed benchmark were performed on four representative MMDBs: ArangoDB, OrientDB, AgensGraph, and Spark SQL. We provide a comprehensive analysis of internal data representations, multi-model query and transaction processing, and performance results for distributed execution.
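To give a flavor of what "multi-model" means in practice, the sketch below shows one social-commerce entity viewed through several data models, as an MMDB must store and query them side by side. This is a hypothetical illustration only; the entity names and fields are invented and do not reflect UniBench's actual generated schema.

```python
# Hypothetical illustration (not UniBench's actual schema): one social-commerce
# scenario spanning the data models a multi-model database must handle.

# Tabular/relational facet: a customer row.
customer_row = {"id": 101, "name": "Alice", "country": "FI"}

# Document (JSON) facet: an order with nested line items.
order_doc = {
    "order_id": "o-5001",
    "customer_id": 101,
    "items": [
        {"product": "p-9", "qty": 2, "price": 19.9},
        {"product": "p-3", "qty": 1, "price": 5.0},
    ],
}

# Key-value facet: a session entry keyed by an opaque token.
kv_store = {"session:abc123": {"customer_id": 101, "cart": ["p-9"]}}

# Graph facet: "knows" edges among customers.
knows_edges = [(101, 102), (102, 103)]

def order_total(doc):
    """Sum qty * price over an order's line items (a document traversal)."""
    return sum(item["qty"] * item["price"] for item in doc["items"])

# A multi-model query would combine several facets at once, e.g. traverse the
# knows graph from Alice and aggregate order totals from the matching documents.
print(order_total(order_doc))
```

A benchmark like UniBench has to generate such interlinked data at scale and pick query parameters so that selectivity is controlled across all of these models at once, which is what the paper's parameter-curation algorithm addresses.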

Online open access version: https://link.springer.com/article/10.1007/s10619-019-07279-6

UniBench project website: https://www.helsinki.fi/en/researchgroups/unified-database-management-systems-udbms/unibench-towards-benchmarking-multi-model-dbms

UniBench benchmark at Github: https://github.com/HY-UDBMS/UniBench