Just like Protractor is the end-to-end test runner for AngularJS that checks for functional regressions, this project is a way to check for performance regressions while reusing the same test cases.
Install protractor-perf using `npm install -g protractor-perf`.
Protractor test cases are reused to run scenarios where performance needs to be measured. Protractor-perf can be used just like protractor, except that the test cases need to be instrumented to indicate when to start and stop measuring performance.
Protractor is usually invoked using `protractor conf.js`. Use `protractor-perf conf.js` instead to start measuring performance.
The config file is the same configuration file used for protractor tests.
Note: If you run selenium using protractor's `webdriver-manager`, you would need to specify the `seleniumPort` and `selenium` keys in the config file, to explicitly specify the port on which the selenium server will run. This port will also be picked up by protractor-perf. See `./test/conf.js` for an example.
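A minimal sketch of such a config file, assuming the key names from the note above and reusing the illustrative port and hub URL from the Gruntfile example further below:

```js
// conf.js -- sketch only; adjust the spec file name and port to your project
exports.config = {
    specs: ['perf-spec.js'],                   // hypothetical name for your instrumented specs
    selenium: 'http://localhost:54321/wd/hub', // explicit selenium server address
    seleniumPort: 54321                        // port the selenium server will run on
};
```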
When the instrumented test cases are run using protractor, the code related to performance is a no-op. This way, adding instrumentation does not break your ability to run protractor to just test for functionality.
The test cases need to specify when to start and stop measuring performance metrics for a given scenario. The following code is an example of a test case with perf code snippets added.
```js
var PerfRunner = require('..'); // use require('protractor-perf') when installed from npm

describe('angularjs homepage todo list', function() {
    var perfRunner = new PerfRunner(protractor, browser);

    it('should add a todo', function() {
        browser.get('http://www.angularjs.org');

        perfRunner.start(); // start measuring perf metrics
        element(by.model('todoList.todoText')).sendKeys('write a protractor test');
        element(by.css('[value="add"]')).click();
        perfRunner.stop(); // stop measuring perf metrics

        if (perfRunner.isEnabled) { // no-op when run with plain protractor
            expect(perfRunner.getStats('meanFrameTime')).toBeLessThan(60);
        }

        var todoList = element.all(by.repeater('todo in todoList.todos'));
        expect(todoList.count()).toEqual(3);
        expect(todoList.get(2).getText()).toEqual('write a protractor test');
    });
});
```
The four statements to note are:
- Initialize the perf runner using `new PerfRunner(protractor, browser)`.
- To start measuring the perf metrics, use `perfRunner.start()`.
- Once the scenario that you would like to perf test completes, use `perfRunner.stop()`.
- Finally, use `perfRunner.getStats('statName')` in `expect` statements to ensure that all the performance metrics are within the acceptable range.
The `perfRunner.isEnabled` check is needed to ensure that perf metrics are not tested when the test case is run using `protractor` directly.
`protractor-perf` is based on browser-perf, which measures the metrics that can be tested for regressions. Look at browser-perf's wiki page for more information about the project.
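As a sketch, other browser-perf metrics can be asserted the same way. The metric names and thresholds below are assumptions based on browser-perf's navigation-timing probes; check the browser-perf wiki for the metrics your setup actually reports:

```js
// Sketch only -- metric names are assumed, not taken from this README;
// see the browser-perf wiki for the metrics reported by your probes.
if (perfRunner.isEnabled) {
    expect(perfRunner.getStats('loadTime')).toBeLessThan(2000);     // assumed navigation-timing metric
    expect(perfRunner.getStats('domReadyTime')).toBeLessThan(1500); // assumed navigation-timing metric
}
```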
Invoke `protractor-perf` from a Gruntfile as shown below.
```js
module.exports = function(grunt) {
    var protractorperf = require('protractor-perf');

    grunt.registerTask('protractorperf', function() {
        var done = this.async();
        // Optional config object that overrides properties of the config file.
        // Useful for setting property values from grunt.option().
        var argv = {
            selenium: 'http://localhost:54321/wd/hub',
            seleniumPort: 54321
        };
        protractorperf.run('./merci-perf-conf.js', done, argv); // config file
    });

    grunt.registerTask('run', ['protractorperf']);
};
```
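With the tasks registered above, `grunt run` (or `grunt protractorperf` directly) runs the perf tests, with the values in `argv` overriding the corresponding properties in the config file.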