Announcements & News
- RDSS program announced
- Paper acceptance notification moved to: May 8th
- Paper submission deadline extended to: April 19th
Motivation and Scope
As Cloud Computing becomes more pervasive, existing database management systems (DBMS) are being offered as cloud data services and new cloud data services are emerging. These new data services cater to web-scale workloads and Big Data. The cost of testing database systems has traditionally been a large part of the total development cost.
Building reliable data services, and testing, benchmarking, and monitoring their quality of service in a production environment, is at least as challenging and costly an undertaking. Novel, even revolutionary, ideas are needed to improve this situation.
The Reliable Data Services & Systems Workshop (RDSS) replaces the Workshop on Testing Database Systems (DBTest). Like DBTest, RDSS aims to expose the academic community to the challenges industry practitioners face in building and testing reliable data services. The long-term goal is to devise new techniques that reduce the cost and time to build, test, and tune data services, so that vendors can spend more time and energy on actual innovations. We see the challenges in building and testing data services as a superset of those faced in traditional DBMS testing. RDSS's goal is to shift the focus of the conversation and of research ideas to data services, while remaining open to the areas of interest covered by DBTest in the past.
Topics of Interest
- Testing and resilience in service-oriented architectures
- Testing issues in multi-tenant database systems and cloud database systems
- Testing issues in large-scale analytics systems (e.g., Hadoop)
- Testing the reliability and availability of database services and systems
- Metrics for quality of service, elasticity, scale, predictability and workload performance
- Metrics and validation techniques related to service level agreements
- Designs, algorithms and techniques that improve the reliability and testability of data services
- Testing the efficiency of adaptive policies and components
- Testing data services, database systems, and database applications
- Generation of test artifacts (e.g., test data, test queries)
- Maximizing code coverage of database systems/services/applications
- Improving the user experience of data services and systems
- Testing and designing systems that are robust to estimation inaccuracies
- Identifying performance bottlenecks
- Robust query processing
- Security and vulnerability testing
- War stories and vision papers