During a recent webinar (2020-11-05), Craig Shallahamer wondered whether using Oracle Resource Manager can increase throughput when a system is running at 100% CPU.
The previous investigation concentrated on showing the scheduling effect of having a CDB resource plan. A summary of that investigation can be found here.
The AWR Difference Report showed:
| Resource Plan | Avg Active Users | Elapsed Time (min) | DB time (min) |
|---|---|---|---|
| None | 4.9 | 16.0 | 77.9 |
| JAR_CDB_PLAN | 4.8 | 14.1 | 67.6 |
| %Diff | -1.2 | -12.2 | -13.2 |
There is some improvement shown. However, is this difference statistically significant?
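Answering that would mean repeating each run several times and comparing the resulting samples with a significance test. As a minimal sketch, assuming repeated elapsed-time measurements had been collected for each configuration (the values below are hypothetical placeholders, not results from the report above), a Welch's t-test with SciPy could look like this:

```python
# Hypothetical elapsed times (minutes) from repeated runs of each configuration.
# These are placeholder values, NOT real measurements from the AWR report.
from scipy import stats

no_plan      = [16.0, 15.7, 16.3, 15.9, 16.1]   # runs without a resource plan
jar_cdb_plan = [14.1, 14.4, 13.9, 14.2, 14.0]   # runs with JAR_CDB_PLAN

# Welch's t-test: does not assume the two samples have equal variance.
t_stat, p_value = stats.ttest_ind(no_plan, jar_cdb_plan, equal_var=False)

print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) would suggest the elapsed-time difference
# is unlikely to be due to chance alone.
```

Welch's variant is used here because the two configurations need not share the same variance; a non-parametric alternative such as the Mann-Whitney U test (`scipy.stats.mannwhitneyu`) would avoid the normality assumption altogether.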
I think I should restrict myself to the simplest test case:
- Oracle Enterprise Edition 19c
- Single PDB
- No CDB Resource Plan
- Running on VirtualBox
- Vary CPUs on VirtualBox from one (1) to six (6), restricted by the underlying hardware (see the sketch after this list)
- No RAC
- Whether to use HammerORA to generate load
- Whether to use Ansible to generate the test case
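For the CPU sweep, the VirtualBox side could be automated from a small script. A minimal sketch is shown below, assuming a hypothetical VM name of `ora19c-vm` and driving `VBoxManage` from Python; the workload step is left as a placeholder rather than a real HammerORA invocation:

```python
# Sketch: step the CPU count from 1 to 6 on a VirtualBox VM, one test run per setting.
# "ora19c-vm" is a placeholder VM name, not taken from the original post.
import subprocess

VM_NAME = "ora19c-vm"  # hypothetical VM name

for cpus in range(1, 7):
    # The VM must be powered off before its CPU count can be changed.
    subprocess.run(["VBoxManage", "modifyvm", VM_NAME, "--cpus", str(cpus)], check=True)
    subprocess.run(["VBoxManage", "startvm", VM_NAME, "--type", "headless"], check=True)

    # ... run the HammerORA workload here and record the elapsed time ...

    # Shut the VM down cleanly before the next iteration.
    subprocess.run(["VBoxManage", "controlvm", VM_NAME, "acpipowerbutton"], check=True)
```

Collecting several repetitions at each CPU count would also provide the samples needed for the significance test sketched earlier.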