Copy-pasting a student’s inquiry to support:
My name is Jamie Tabone and I am an undergraduate student reading for a degree in Software Development at the University of Malta. As part of my dissertation, I am conducting a research study that compares two web technology stacks in terms of how linearly their resource usage scales as throughput increases.
The first stack is the MEAN stack, which mainly consists of Node.js and MongoDB. The other is a proposed Erlang-based stack, which mainly consists of Cowboy as the web server and Couchbase (which is itself partially built in Erlang) as the database server.
During the tests, the resource usage of Couchbase looked as shown in the charts below. Do you have any idea why Couchbase follows no clear resource usage pattern as load increases? In fact, even before the actual test was started, the Couchbase server showed the same erratic pattern of resource usage.
Note that the test scenario simulates a web application with a high volume of writes to the database: a user document (including name, surname, username, password, dob and about fields) is inserted for each request. The test runs for 20 minutes, and the request rate increases by 30 requests per second every 30 seconds, starting at one request per second and ramping up to 1170 requests per second.
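For reference, the per-request payload and the ramp schedule described above can be sketched roughly as follows. This is only an illustrative reconstruction: the function names, the field values, and the exact shape of the document are assumptions, not the actual test harness.

```javascript
// Hypothetical sketch of the write-heavy test scenario described above.
// Each simulated request inserts one user document with the fields
// mentioned in the inquiry; the values here are placeholders.
function makeUserDoc(i) {
  return {
    name: `Name${i}`,
    surname: `Surname${i}`,
    username: `user${i}`,
    password: `secret${i}`, // plain text only because this is synthetic load-test data
    dob: "1990-01-01",
    about: `Generated test user #${i}`,
  };
}

// Ramp schedule: start at 1 request/second and add 30 requests/second
// every 30 seconds over the 20-minute (1200 s) run.
function rateAt(second) {
  return 1 + 30 * Math.floor(second / 30);
}
```

A load generator would then, for each second `t` of the run, issue `rateAt(t)` requests, each carrying one `makeUserDoc(i)` payload.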
Resources of the database server: 7 GB of RAM and 4 CPU cores.
CPU usage: [chart not included]
Thank you. Your help will be much appreciated.